Please try the following steps:
Make sure you download the latest Snowflake JDBC Driver jar (the driver itself, NOT the javadoc.jar):
Drop the .jar file (driver) under $SPLUNK_HOME/etc/apps/splunk_app_db_connect/drivers
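If you want to script that step, a minimal sketch assuming you pull the driver from Maven Central (replace <version> with the latest release number):
curl -O https://repo1.maven.org/maven2/net/snowflake/snowflake-jdbc/<version>/snowflake-jdbc-<version>.jar
cp snowflake-jdbc-<version>.jar $SPLUNK_HOME/etc/apps/splunk_app_db_connect/drivers/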
Create or update db_connection_types.conf under $SPLUNK_HOME/etc/apps/splunk_app_db_connect/local with the following stanza:
[snowflake]
displayName = Snowflake
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcDriverClass = net.snowflake.client.jdbc.SnowflakeDriver
jdbcUrlFormat = jdbc:snowflake://<host>:<port>/?db=<database>
ui_default_catalog = $database$
port = 443
After a Splunk restart, the Snowflake driver should be available via the UI
Create a new Identity with your Snowflake credentials
Create a new Database Connection for Snowflake with the following:
Select the Snowflake Connection Type
Select the Timezone set/used by your Snowflake Database/Environment
Check the "Edit JDBC URL" checkbox, this will have to be manually provided as per
https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html or something similar to
Replace all the <parameters> to reflect your environment.
Make sure that Read Only is unchecked as this parameter is not available on Snowflake
Fetch size can be left alone
Your Database Connection should look as follows: [screenshot of the completed connection settings]
The JDBC URL can be adapted to your environment as long as it matches their JDBC Driver Connection String format: https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#jdbc-driver-connection-string
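For instance, a purely illustrative example (the account name, database, and warehouse below are placeholders for your own values):
jdbc:snowflake://xy12345.snowflakecomputing.com:443/?db=SALES&warehouse=WH1&schema=PUBLIC
Here xy12345.snowflakecomputing.com is your account host, and db/warehouse/schema set the default context for the session.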
btool never lies, and this usually means one of the following:
the file is not accessible by the Splunk user
web.conf not in the right location - $SPLUNK_HOME/etc/apps/<app_name>/[default|local]/web.conf
configuration file precedence - http://docs.splunk.com/Documentation/Splunk/7.1.1/Admin/Wheretofindtheconfigurationfiles
perhaps check the file's md5 checksum (see the command below) - maybe a corrupted file?
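For the checksum idea, a sketch (assuming a Linux host and a known-good copy of the file to compare against):
md5sum $SPLUNK_HOME/etc/apps/<app_name>/local/web.conf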
What path does it point to when using:
splunk cmd btool web list settings --debug
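The --debug flag prefixes every line with the file it was read from, so the output should look something like this (paths are illustrative):
/opt/splunk/etc/apps/your_app/local/web.conf   [settings]
/opt/splunk/etc/system/default/web.conf        enableSplunkWebSSL = false
Whichever file wins precedence for a given setting is the one shown on that setting's line.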
First: It would be good to know what your Splunk architecture looks like, especially how you are feeding the syslog data to Splunk.
Are you simply using a standalone instance that acts as both the Search Head and the Indexer? From your post, I would assume it is.
(The reason behind that question is to figure out where your parsing phase happens (Heavy Forwarder or Indexer), since the Technical Add-on (TA) will need to be installed on that specific instance as well.)
Since you are able to search sourcetype="cisco:ios" or source="udp:514" , verify whether the fields are being extracted accordingly.
Install TA on the Search Head and Heavy Forwarder or Indexers (depending on your data flow)
Install the App on the Search Head only
Second: Syslog event format and data flow - it is possible the events being received are not in the format expected by the TA. Feel free to share a raw event (obfuscating any confidential information) and/or your data flow (ie: Syslog-ng server with UF --> IX). A sample of what the TA expects is shown below.
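For reference, a typical Cisco IOS syslog event follows the %FACILITY-SEVERITY-MNEMONIC pattern and looks something like this (an illustrative sample, not taken from any real device):
<189>146: *Mar 1 18:46:11.012: %SYS-5-CONFIG_I: Configured from console by admin on vty0 (10.0.0.5)
If your events are missing that structure (for example, because an intermediate relay rewrites the header), the TA's extractions may not match.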
Third: The next thing I would ask is: where is your data being indexed - which index? If you used a custom index (ie: index=ciscoios), make sure the index is part of your "Indexes searched by default" in your user role (Settings > Access Controls > Roles > your_current_role > Indexes searched by default). By default, Splunk will only search the main index if an index is not specified in your SPL search.
Example - This SPL search will only search inside the default searched indexes (default: index=main ):
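sourcetype="cisco:ios"
whereas explicitly naming your custom index will return the events regardless of the role's default indexes:
index=ciscoios sourcetype="cisco:ios"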
Fourth: Edit the cisco_ios_index macro (default: index=ios ) to include the index where your data resides. ie: index=ios OR index=your_index
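A sketch of that macro override in macros.conf (the app folder name below is an assumption - place it in the local folder of whichever app ships the macro):
# $SPLUNK_HOME/etc/apps/<TA_app_name>/local/macros.conf
[cisco_ios_index]
definition = (index=ios OR index=your_index)
You can also edit the macro via Settings > Advanced search > Search macros in the UI.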
Anyhow, let us know what you find out or whether you require further assistance.
Could you check whether your reports, eventtypes, or any other knowledge objects are under your app folder: $SPLUNK_HOME/etc/apps/your_app_name/default or /local?
My first thought would be to verify that your knowledge objects are not Private; they need to be shared at the app level. If they are private, they won't be part of the package, as private objects live under $SPLUNK_HOME/etc/users/... A quick way to check is shown below.
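A quick way to list private objects (a sketch, assuming a *nix host - adjust the conf file name for the object type you are after):
find $SPLUNK_HOME/etc/users -name "savedsearches.conf"
Anything that shows up there belongs to a user's private space and needs its sharing changed to App (or Global) before it will be included in the exported package.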
However, please let me know if that is the case.