All Apps and Add-ons

Cloud Snowflake DB Connect Integration

ekcsoc
Path Finder

Hi,

Could you please help? I am trying to integrate Snowflake with DB Connect.

I am using the following JRE: /usr/java/jdk1.8.0_131/jre

I have placed the Snowflake jar at the following locations:

/opt/splunk/etc/apps/splunk_app_db_connect/drivers/snowflake-jdbc-3.9.1-javadoc.jar
/opt/splunk/etc/apps/splunk_app_db_connect/drivers/snowflake-libs/snowflake-jdbc-3.9.1-javadoc.jar
/opt/splunk/etc/apps/splunk_app_db_connect/jars/snowflake-jdbc-3.9.1-javadoc.jar

I have added the stanza below to /opt/splunk/etc/apps/splunk_app_db_connect/local/db_connection_types.conf

[snowflake]
displayName = Snowflake
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcUrlFormat = jdbc:snowflake://:/
jdbcDriverClass = com.snowflake.client.jdbc.SnowflakeDriver

Now when I create a connection and try to save it, I get the following error:

Database connection snowflake is invalid
No suitable driver found for jdbc:snowflake://"":443/
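A "No suitable driver found" error often means the jar on disk is not the real driver (a javadoc.jar, for instance, contains only HTML docs, not the driver class). A quick, hypothetical way to verify a jar before dropping it into the drivers directory, sketched in Python since a jar is just a zip archive:

```python
import zipfile

# Path of the driver class inside a genuine Snowflake JDBC jar.
DRIVER_CLASS = "net/snowflake/client/jdbc/SnowflakeDriver.class"

def contains_driver(jar):
    """Return True if the jar (path or file-like object) ships the
    Snowflake JDBC driver class; a javadoc jar will not."""
    with zipfile.ZipFile(jar) as archive:
        return DRIVER_CLASS in archive.namelist()
```

The same check works from the shell with `unzip -l <jar> | grep SnowflakeDriver.class`.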

1 Solution

ptang_splunk
Splunk Employee

Hi there,

Please try the following steps:

  1. Make sure you download the latest Snowflake JDBC Driver jar and NOT the javadoc.jar: https://docs.snowflake.net/manuals/user-guide/jdbc-download.html#downloading-the-driver OR https://search.maven.org/classic/#search%7Cga%7C1%7Csnowflake-jdbc
  2. Drop the .jar file (driver) under $SPLUNK_HOME/etc/apps/splunk_app_db_connect/drivers
  3. Create or update db_connection_types.conf under $SPLUNK_HOME/etc/apps/splunk_app_db_connect/local with the following:

     [snowflake]
     displayName = Snowflake
     serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
     jdbcDriverClass = net.snowflake.client.jdbc.SnowflakeDriver
     jdbcUrlFormat = jdbc:snowflake://<host>:<port>/?db=<database>
     ui_default_catalog = $database$
     port = 443
  4. Restart Splunk
  5. Now, the Snowflake Driver should be available via the UI
  6. Create a new Identity with your Snowflake credentials
  7. Create a new Database Connection for Snowflake with the following:
  8. Snowflake Identity
  9. Snowflake Connection Type
  10. Select the Timezone set/used by your Snowflake Database/Environment
  11. Check the "Edit JDBC URL" checkbox; the URL has to be provided manually as per https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html, for example jdbc:snowflake://<account_name>.snowflakecomputing.com/?<connection_params> OR jdbc:snowflake://<account_name>.snowflakecomputing.com/?db=<database>&warehouse=<warehouse>&schema=<schema>. Replace all the <parameters> to reflect your environment.
  12. Make sure that Read Only is unchecked as this parameter is not available on Snowflake
  13. Fetch size can be left alone
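The URL shape from step 11 can be assembled programmatically. A minimal illustrative helper (not part of DB Connect; the parameter names mirror Snowflake's db/warehouse/schema connection parameters):

```python
def snowflake_jdbc_url(account, database=None, warehouse=None, schema=None):
    """Build a JDBC URL in the shape shown in step 11:
    jdbc:snowflake://<account>.snowflakecomputing.com/?db=...&warehouse=...&schema=...
    Only the parameters actually supplied are appended."""
    base = f"jdbc:snowflake://{account}.snowflakecomputing.com/"
    params = {"db": database, "warehouse": warehouse, "schema": schema}
    query = "&".join(f"{k}={v}" for k, v in params.items() if v)
    return base + ("?" + query if query else "")
```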

Your Database Connection should look as follows: Screenshot

The JDBC URL can be adapted to your environment as long as it matches the JDBC Driver Connection String format: https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#jdbc-driver-connection-string


phoran
New Member

Can you post your connection string? (Leave out the sensitive parts, obviously...)
What is your Snowflake account? Remember that the Snowflake "account" is everything preceding the ".snowflakecomputing.com" in your account URL. There should be both an account name and a region identifier - like "mySnowflakeAccount.us-east-1." for AWS East. Only Snowflake accounts based in the AWS US-West region will not have a region identifier included.
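Extracting the account identifier from an account URL can be sketched as follows (an illustrative helper, not Snowflake's or Splunk's code):

```python
def snowflake_account(url):
    """Return everything before '.snowflakecomputing.com' in an account
    URL or JDBC host, e.g. 'mySnowflakeAccount.us-east-1'."""
    host = url.split("//")[-1]   # drop any scheme prefix
    host = host.split("/")[0]    # drop any path or query string
    host = host.split(":")[0]    # drop any :443 port suffix
    suffix = ".snowflakecomputing.com"
    if not host.endswith(suffix):
        raise ValueError(f"not a snowflakecomputing.com host: {host}")
    return host[: -len(suffix)]
```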

-Paul Horan-
Snowflake


bwu_hcf
New Member

The JDBC URL is below:

jdbc:snowflake://xxx.ap-southeast-2.snowflakecomputing.com:443/?warehouse=XXXXX_WH&db=XXXXXXXXX


dmaislin_splunk
Splunk Employee

I assume you followed Snowflake's post on the setup? Have you also reached out to them for assistance?

https://support.snowflake.net/s/article/Integrating-Snowflake-and-Splunk-with-DBConnect


bwu_hcf
New Member

Thanks for reminding me! That is actually my next step as I am suspecting they could be blocking this type of connection from their end by default.


phoran
New Member

Snowflake will only "block" connection attempts with invalid credentials, or coming from network IP addresses outside your "whitelisted" network policy (should one have been created).
This is more likely an invalid connection string.

-Paul-


bwu_hcf
New Member

Thanks, @phoran ! When I tried to create such a connection via the Splunk DB Connect GUI, it showed "This is an invalid connection" when I saved the settings. If I ask our Snowflake counterpart to add our IP to their whitelist, should that resolve the issue?


phoran
New Member

If you're trying to connect from outside the whitelisted IP range, the error message from Snowflake is "IP XX.XX.XX.XX is not allowed to access Snowflake. Contact your local security administrator."

The error you're seeing sounds more like a malformed connection string.

-Paul-


dmaislin_splunk
Splunk Employee

I see you have a support case open on this, correct? Did you follow their guidance regarding the link and the timeout behavior?


bwu_hcf
New Member

Yes, I captured a packet trace with tcpdump and will attach the Wireshark output to the case for further analysis. I also increased the timeout to "maxWaitMillis=1800000" and tested the connection again, but still no luck: the same error messages appear in the internal logs as above. I am upgrading the JDBC driver to the latest version to test again.


dmaislin_splunk
Splunk Employee

What is the version of the JDBC driver? Please provide the link to whichever one you downloaded. Thank you.


bwu_hcf
New Member

I am currently using the snowflake-jdbc-3.12.4.jar driver, downloaded from the site below:

https://repo1.maven.org/maven2/net/snowflake/snowflake-jdbc/


dmaislin_splunk
Splunk Employee

Are you testing with telnet from your client PC or from where DBX is installed? Just curious.


bwu_hcf
New Member

Telnet testing is from the Splunk HF where DBX is installed.
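The telnet check can be reproduced without telnet installed. A rough equivalent sketched in Python; note it only proves TCP reachability, not that TLS or the JDBC handshake will succeed:

```python
import socket

def port_reachable(host, port, timeout=5.0):
    """Rough equivalent of 'telnet host port': can we open a
    TCP connection within the timeout?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```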



bwu_hcf
New Member

Hi @ptang_splunk ,

Thanks for providing the detailed steps above! I have configured our DB Connect inputs for Snowflake as described in your post, as well as following the doc below from Snowflake:

https://community.snowflake.com/s/article/Integrating-Snowflake-and-Splunk-with-DBConnect

I am able to telnet to port 443 on the host specified in the "JDBC URL" from the Splunk HF from which the DB connection is initiated. However, I still encounter the errors below in the internal logs (splunk_app_db_connect_server.log) after restarting the splunkd service:

2020-05-30 00:03:19.220 +1000 [QuartzScheduler_Worker-10] WARN com.zaxxer.hikari.pool.PoolBase - unnamed_pool_1112188830_jdbc_snowflake//XXX.XX-XXXXXXXXX-X.snowflakecomputing.com443/?warehouseSPLUNK_WH&dbSNOWFLAKE - Failed to register management beans.
javax.management.RuntimeOperationsException: null
at com.sun.jmx.mbeanserver.Repository.addMBean(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerWithRepository(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerDynamicMBean(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerObject(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(Unknown Source)
at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(Unknown Source)
at com.zaxxer.hikari.pool.PoolBase.registerMBeans(PoolBase.java:258)
at com.zaxxer.hikari.pool.HikariPool.(HikariPool.java:116)
at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:94)
at com.splunk.dbx.connector.ConnectorFactory.getConnectionFromPool(ConnectorFactory.java:201)
at com.splunk.dbx.connector.ConnectorFactory.getConnection(ConnectorFactory.java:189)
at com.splunk.dbx.connector.ConnectorFactory.create(ConnectorFactory.java:174)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.executeQuery(DbInputRecordReader.java:60)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.open(DbInputRecordReader.java:52)
at org.easybatch.core.job.BatchJob.openReader(BatchJob.java:117)
at org.easybatch.core.job.BatchJob.call(BatchJob.java:74)
at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
Caused by: java.lang.IllegalArgumentException: Repository: cannot add mbean for pattern name com.zaxxer.hikari:type=PoolConfig (unnamed_pool_1112188830_jdbc
snowflake//XXX.XX-XXXXXXXXX-X.snowflakecomputing.com443/?warehouseSPLUNK_WH&db_SNOWFLAKE)
... 19 common frames omitted

2020-05-30 00:03:21.109 +1000 [QuartzScheduler_Worker-10] ERROR org.easybatch.core.job.BatchJob - Unable to open record reader
java.sql.SQLFeatureNotSupportedException: null
at net.snowflake.client.jdbc.SnowflakeConnectionV1.setReadOnly(SnowflakeConnectionV1.java:518)
at com.zaxxer.hikari.pool.ProxyConnection.setReadOnly(ProxyConnection.java:388)
at com.zaxxer.hikari.pool.HikariProxyConnection.setReadOnly(HikariProxyConnection.java)
at com.splunk.dbx.connector.ConnectorFactory.getConnection(ConnectorFactory.java:190)
at com.splunk.dbx.connector.ConnectorFactory.create(ConnectorFactory.java:174)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.executeQuery(DbInputRecordReader.java:60)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.open(DbInputRecordReader.java:52)
at org.easybatch.core.job.BatchJob.openReader(BatchJob.java:117)
at org.easybatch.core.job.BatchJob.call(BatchJob.java:74)
at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)

2020-05-30 00:03:21.109 +1000 [QuartzScheduler_Worker-10] INFO org.easybatch.core.job.BatchJob - Job 'SNOWFLAKE_ACCOUNT_QUERY_TEST' finished with status: FAILED

The above error trace does not give much information on why the connection failed, apart from the two messages below:

"Failed to register management beans"
"Unable to open record reader"
