Splunk Cloud Platform

How to get logs from Snowflake into Splunk

Splunkerninja
Path Finder

Hello,

I have a requirement to get logs into Splunk from Snowflake, and I have no idea where to start.

I came across the Splunk docs for Splunk DB Connect: https://docs.splunk.com/Documentation/DBX/3.15.0/DeployDBX/Installdatabasedrivers

Can you guide me on how to get started here? How do I get logs from Snowflake into Splunk? Can I use an HEC token to get the logs?


Richfez
SplunkTrust

I found this -

https://community.snowflake.com/s/article/Integrating-Snowflake-and-Splunk-with-DBConnect

It looks like it walks you through exactly how to do that.
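
In case it helps to see the shape of it: once the Snowflake JDBC driver is installed in DB Connect, the input itself is just a SQL query with a rising column for the checkpoint. A rough sketch (not lifted from the article; which columns you pull is up to you, these are the standard ACCOUNT_USAGE ones):

    -- DB Connect rising column input against Snowflake's login audit view.
    -- The ? placeholder is where DB Connect substitutes the last checkpoint value.
    SELECT EVENT_ID, EVENT_TIMESTAMP, USER_NAME, CLIENT_IP, IS_SUCCESS, ERROR_MESSAGE
    FROM SNOWFLAKE.ACCOUNT_USAGE.LOGIN_HISTORY
    WHERE EVENT_ID > ?
    ORDER BY EVENT_ID ASC;

With EVENT_ID as the rising column and EVENT_TIMESTAMP as the timestamp column, DB Connect should only pull new rows on each run.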

If that helps, karma's always appreciated!

Happy Splunking Snowflake data!

-Rich


trobknight7
Engager

Hey @Richfez or @Splunkerninja ,

I've successfully ingested the Snowflake LOGIN_HISTORY and SESSIONS tables, but I'm running into roadblock after roadblock with the ACCESS_HISTORY and QUERY_HISTORY table ingestions.

https://docs.snowflake.com/en/sql-reference/account-usage/access_history
https://docs.snowflake.com/en/sql-reference/account-usage/query_history

These tables have a QUERY_ID field that looks like this: 

a0fda135-d678-4184-942b-c3411ae8d1ce

And a QUERY_START_TIME (TIMESTAMP_LTZ) field that looks like this:
2022-01-25 16:17:47.388 +0000

The checkpoint value system for a rising column input in the Splunk DB Connect app doesn't play nicely with either of these fields, and I've tried creating temporary fields and tables to work around the issue, but to no avail.

For the QUERY_ID, I tried removing the hyphens, replacing the letters a, b, c, d, e, f with the numeric values 1, 2, 3, 4, 5, 6, and storing the result in a different field called QUERY_ID_NUMERIC. When trying that out as the checkpoint value, the checkpoint never gets updated, so it just ingests the same data over and over again.

Similarly, for the QUERY_START_TIME, I've tried casting the TIMESTAMP_LTZ to TIMESTAMP_NTZ and saving it as a new field QUERY_START_TIME_NTZ. That ingests the data, but the checkpoint value isn't updating either.
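
For reference, the temporary fields I described look roughly like this (a sketch from memory; the real queries pull a few more columns):

    SELECT
        QUERY_ID,
        -- hyphens stripped and the hex letters a-f mapped to digits, used as the rising column
        TRANSLATE(REPLACE(QUERY_ID, '-', ''), 'abcdef', '123456') AS QUERY_ID_NUMERIC,
        -- local-timezone timestamp cast to TIMESTAMP_NTZ for the checkpoint
        CAST(QUERY_START_TIME AS TIMESTAMP_NTZ) AS QUERY_START_TIME_NTZ
    FROM SNOWFLAKE.ACCOUNT_USAGE.ACCESS_HISTORY;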

I was wondering if anyone has experienced this issue when ingesting these two data sources and whether they've found any workarounds that resolved it!
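
One thing I haven't tried yet, but am considering, is converting the timestamp to a plain epoch number so the rising column is purely numeric; something like this (untested sketch):

    SELECT
        QUERY_ID,
        QUERY_START_TIME,
        -- epoch milliseconds as a monotonically increasing numeric rising column
        DATE_PART(EPOCH_MILLISECOND, QUERY_START_TIME) AS QUERY_START_TIME_EPOCH
    FROM SNOWFLAKE.ACCOUNT_USAGE.ACCESS_HISTORY
    WHERE DATE_PART(EPOCH_MILLISECOND, QUERY_START_TIME) > ?
    ORDER BY QUERY_START_TIME_EPOCH ASC;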

Thank you very much!
