Hi,
Could you please help? I am trying to integrate Snowflake with DB Connect.
I am using the following JRE: /usr/java/jdk1.8.0_131/jre
I have placed the Snowflake driver at the following locations:
/opt/splunk/etc/apps/splunk_app_db_connect/drivers/snowflake-jdbc-3.9.1-javadoc.jar
/opt/splunk/etc/apps/splunk_app_db_connect/drivers/snowflake-libs/snowflake-jdbc-3.9.1-javadoc.jar
/opt/splunk/etc/apps/splunk_app_db_connect/jars/snowflake-jdbc-3.9.1-javadoc.jar
I have put the below stanza in /opt/splunk/etc/apps/splunk_app_db_connect/local/db_connection_types.conf:
[snowflake]
displayName = Snowflake
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcUrlFormat = jdbc:snowflake://:/
jdbcDriverClass = com.snowflake.client.jdbc.SnowflakeDriver
Now when I create a connection and try to save it, I get the following error:
Database connection snowflake is invalid
No suitable driver found for jdbc:snowflake://"":443/
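A side note on what "No suitable driver found" usually means: DriverManager auto-registers drivers via a META-INF/services/java.sql.Driver entry inside the driver jar, and a -javadoc.jar contains only HTML documentation, with no classes and no such entry. The sketch below (my own illustration, not part of DB Connect) shows how to check a jar for that entry; it builds a tiny stand-in jar so it can run even without the real snowflake-jdbc jar on disk.

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class DriverJarCheck {

    // Returns the driver class declared in the jar's JDBC service entry,
    // or null if the jar has none (a -javadoc.jar ships only HTML docs,
    // so it has neither this entry nor any classes).
    static String readDriverEntry(Path jar) throws Exception {
        try (JarFile jf = new JarFile(jar.toFile())) {
            JarEntry entry = jf.getJarEntry("META-INF/services/java.sql.Driver");
            if (entry == null) {
                return null; // not a driver jar: DriverManager cannot auto-register it
            }
            try (InputStream in = jf.getInputStream(entry)) {
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                byte[] buf = new byte[4096];
                for (int n; (n = in.read(buf)) > 0; ) {
                    out.write(buf, 0, n);
                }
                return out.toString("UTF-8").trim();
            }
        }
    }

    // Builds a tiny stand-in "driver jar" so the check can be exercised
    // without the real snowflake-jdbc jar on disk.
    static Path createDemoJar() throws Exception {
        Path jar = Files.createTempFile("demo-driver", ".jar");
        try (JarOutputStream out = new JarOutputStream(Files.newOutputStream(jar))) {
            out.putNextEntry(new JarEntry("META-INF/services/java.sql.Driver"));
            out.write("net.snowflake.client.jdbc.SnowflakeDriver\n".getBytes("UTF-8"));
            out.closeEntry();
        }
        return jar;
    }

    public static void main(String[] args) throws Exception {
        Path jar = args.length > 0 ? Paths.get(args[0]) : createDemoJar();
        System.out.println("declared driver: " + readDriverEntry(jar));
    }
}
```

Pointing it at a real driver jar (e.g. `java DriverJarCheck /path/to/snowflake-jdbc-3.9.1.jar`) should print the driver class; pointing it at a -javadoc.jar prints null, which is consistent with the error above.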
Hi there,
Please try the following steps:
1. Make sure you download the jar version of the driver, and NOT the javadoc.jar, from:
https://docs.snowflake.net/manuals/user-guide/jdbc-download.html#downloading-the-driver
OR https://search.maven.org/classic/#search%7Cga%7C1%7Csnowflake-jdbc
2. Use the following stanza in db_connection_types.conf (note that the driver class is net.snowflake.client.jdbc.SnowflakeDriver, not com.snowflake.client.jdbc.SnowflakeDriver):
[snowflake]
displayName = Snowflake
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcDriverClass = net.snowflake.client.jdbc.SnowflakeDriver
jdbcUrlFormat = jdbc:snowflake://<host>:<port>/?db=<database>
ui_default_catalog = $database$
port = 443
3. Set the JDBC URL to:
jdbc:snowflake://<account_name>.snowflakecomputing.com/?<connection_params>
OR jdbc:snowflake://<account_name>.snowflakecomputing.com/?db=<database>&warehouse=<warehouse>&schema=<schema>
Replace all the <parameters> to reflect your environment.
4. Make sure "Read Only" is unchecked, as this parameter is not available on Snowflake. "Fetch size" can be left alone.
Your Database Connection should look as follows: [Screenshot]
The JDBC URL can be adapted as per your environment, as long as it matches their JDBC Driver Connection String: https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#jdbc-driver-connection-string
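To rule DB Connect out of the equation, a small standalone smoke test against the same jar and URL format can confirm the driver class and connection string before wiring them into the app. This is only a sketch: the account, database, warehouse, and credential placeholders are assumptions you would replace with your own values.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class SnowflakeSmokeTest {

    // True if the named driver class is on the classpath. If this returns
    // false for the Snowflake driver, DB Connect will also fail with
    // "No suitable driver found".
    static boolean driverAvailable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) throws Exception {
        String driver = "net.snowflake.client.jdbc.SnowflakeDriver";
        if (!driverAvailable(driver)) {
            System.out.println("driver not on classpath - run with -cp .:snowflake-jdbc-<version>.jar");
            return;
        }
        // Placeholder account/credentials - replace with your own environment.
        String url = "jdbc:snowflake://<account_name>.snowflakecomputing.com/?db=<database>&warehouse=<warehouse>";
        Properties props = new Properties();
        props.put("user", "<user>");
        props.put("password", "<password>");
        try (Connection conn = DriverManager.getConnection(url, props);
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT CURRENT_VERSION()")) {
            while (rs.next()) {
                System.out.println("connected, Snowflake version: " + rs.getString(1));
            }
        }
    }
}
```

Run it with the real driver jar on the classpath, e.g. `java -cp .:snowflake-jdbc-3.9.1.jar SnowflakeSmokeTest`; if this works but DB Connect does not, the problem is in the app configuration rather than the driver or network.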
Can you post your connection string? (Leave out the sensitive parts, obviously...)
What is your Snowflake account? Remember that the Snowflake "account" is everything preceding the ".snowflakecomputing.com" in your account URL. There should be both an account name and a region identifier - like "mySnowflakeAccount.us-east-1." for AWS East. Only Snowflake accounts based in the AWS US-West region will not have a region identifier included.
-Paul Horan-
Snowflake
The JDBC URL is below:
jdbc:snowflake://xxx.ap-southeast-2.snowflakecomputing.com:443/?warehouse=XXXXX_WH&db=XXXXXXXXX
I will assume you followed Snowflake’s post on the setup? Have you also reached out to their side for assistance?
https://support.snowflake.net/s/article/Integrating-Snowflake-and-Splunk-with-DBConnect
Thanks for reminding me! That is actually my next step as I am suspecting they could be blocking this type of connection from their end by default.
Snowflake will only "block" connection attempts with invalid credentials, or coming from network IP addresses outside your "whitelisted" network policy (should one have been created).
This is more likely an invalid connection string.
-Paul-
Thanks, @phoran! When I tried to create such a connection via the Splunk DB Connect GUI, it did show "This is an invalid connection" when I was saving the settings. If I ask our Snowflake counterpart to add our IP to their whitelist, this should resolve the issue. Am I correct?
If you're trying to connect from outside the whitelisted IP range, the error message from Snowflake is "IP XX.XX.XX.XX is not allowed to access Snowflake. Contact your local security administrator."
The error you're seeing sounds more like a malformed connection string.
-Paul-
I see you have a support case open on this correct? Did you follow the guidance they suggested regarding the link and the timeout behavior?
Yes, I captured a tcpdump (reviewed in Wireshark) and will attach the output to the case for further analysis. I have also increased the timeout value to "maxWaitMillis=1800000" and tested the connection again, but still no luck: same error messages as in the internal logs above. I am upgrading the JDBC driver to the latest version to test again.
What is the version of the JDBC driver? Please provide the link to whichever one you downloaded. Thank you.
I am currently using snowflake-jdbc-3.12.4.jar driver downloaded from the site below:
https://repo1.maven.org/maven2/net/snowflake/snowflake-jdbc/
Are you using telnet to test from your client PC or from where DBX is installed? Just curious?
Telnet testing is from the Splunk HF where DBX is installed.
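Worth keeping in mind that a telnet success only proves the TCP port is open; it says nothing about TLS, credentials, or Snowflake's network policy. The same reachability check can be scripted from the HF in plain Java (a minimal sketch; the hostname placeholder is an assumption):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class ReachabilityCheck {

    // Plain TCP reachability test, the same idea as "telnet host 443".
    // A successful connect only proves the port is open; it does not
    // validate TLS, credentials, or Snowflake's IP whitelist policy.
    static boolean canConnect(String host, int port, int timeoutMillis) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMillis);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String host = args.length > 0 ? args[0] : "<account_name>.snowflakecomputing.com";
        System.out.println(host + ":443 reachable: " + canConnect(host, 443, 5000));
    }
}
```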
Hi @ptang_splunk ,
Thanks for providing detailed steps above! I had configured our DB Connect inputs for Snowflake as described by your post as well as following the doc below from Snowflake:
https://community.snowflake.com/s/article/Integrating-Snowflake-and-Splunk-with-DBConnect
I am able to telnet to port 443 on the host specified in the "JDBC URL" from the Splunk HF where the DB connection is initiated. However, I still encountered the below errors in the internal logs (splunk_app_db_connect_server.log) after restarting the splunkd service:
2020-05-30 00:03:19.220 +1000 [QuartzScheduler_Worker-10] WARN com.zaxxer.hikari.pool.PoolBase - unnamed_pool_1112188830_jdbc_snowflake//XXX.XX-XXXXXXXXX-X.snowflakecomputing.com443/?warehouseSPLUNK_WH&dbSNOWFLAKE - Failed to register management beans.
javax.management.RuntimeOperationsException: null
at com.sun.jmx.mbeanserver.Repository.addMBean(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerWithRepository(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerDynamicMBean(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerObject(Unknown Source)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(Unknown Source)
at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(Unknown Source)
at com.zaxxer.hikari.pool.PoolBase.registerMBeans(PoolBase.java:258)
at com.zaxxer.hikari.pool.HikariPool.
at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:94)
at com.splunk.dbx.connector.ConnectorFactory.getConnectionFromPool(ConnectorFactory.java:201)
at com.splunk.dbx.connector.ConnectorFactory.getConnection(ConnectorFactory.java:189)
at com.splunk.dbx.connector.ConnectorFactory.create(ConnectorFactory.java:174)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.executeQuery(DbInputRecordReader.java:60)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.open(DbInputRecordReader.java:52)
at org.easybatch.core.job.BatchJob.openReader(BatchJob.java:117)
at org.easybatch.core.job.BatchJob.call(BatchJob.java:74)
at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
Caused by: java.lang.IllegalArgumentException: Repository: cannot add mbean for pattern name com.zaxxer.hikari:type=PoolConfig (unnamed_pool_1112188830_jdbc
... 19 common frames omitted
2020-05-30 00:03:21.109 +1000 [QuartzScheduler_Worker-10] ERROR org.easybatch.core.job.BatchJob - Unable to open record reader
java.sql.SQLFeatureNotSupportedException: null
at net.snowflake.client.jdbc.SnowflakeConnectionV1.setReadOnly(SnowflakeConnectionV1.java:518)
at com.zaxxer.hikari.pool.ProxyConnection.setReadOnly(ProxyConnection.java:388)
at com.zaxxer.hikari.pool.HikariProxyConnection.setReadOnly(HikariProxyConnection.java)
at com.splunk.dbx.connector.ConnectorFactory.getConnection(ConnectorFactory.java:190)
at com.splunk.dbx.connector.ConnectorFactory.create(ConnectorFactory.java:174)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.executeQuery(DbInputRecordReader.java:60)
at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.open(DbInputRecordReader.java:52)
at org.easybatch.core.job.BatchJob.openReader(BatchJob.java:117)
at org.easybatch.core.job.BatchJob.call(BatchJob.java:74)
at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
2020-05-30 00:03:21.109 +1000 [QuartzScheduler_Worker-10] INFO org.easybatch.core.job.BatchJob - Job 'SNOWFLAKE_ACCOUNT_QUERY_TEST' finished with status: FAILED
The above error trace does not give much information on why the connection failed, apart from the two messages below:
"Failed to register management beans"
"Unable to open record reader"