Splunk Cloud Platform

How to get logs from Snowflake into Splunk

Splunkerninja
Path Finder

Hello,

I have a requirement to get logs into Splunk from Snowflake, and I have no idea where to start.

I came across the Splunk docs on installing database drivers for Splunk DB Connect: https://docs.splunk.com/Documentation/DBX/3.15.0/DeployDBX/Installdatabasedrivers

Can you guide me on how to get started here? How do I get logs from Snowflake into Splunk? Can I use a HEC token to get the logs?

1 Solution

Richfez
SplunkTrust

I found this -

https://community.snowflake.com/s/article/Integrating-Snowflake-and-Splunk-with-DBConnect

It looks like it walks you through exactly how to do that.

If that helps, karma's always appreciated!

Happy Splunking Snowflake data!

-Rich


trobknight7
Engager

Hey @Richfez or @Splunkerninja ,

I've successfully ingested the Snowflake LOGIN_HISTORY and SESSIONS tables, but I'm running into roadblock after roadblock with the ACCESS_HISTORY and QUERY_HISTORY table ingestions.

https://docs.snowflake.com/en/sql-reference/account-usage/access_history
https://docs.snowflake.com/en/sql-reference/account-usage/query_history

These tables have a QUERY_ID field that looks like this: 

a0fda135-d678-4184-942b-c3411ae8d1ce

And a QUERY_START_TIME (TIMESTAMP_LTZ) field that looks like this:
2022-01-25 16:17:47.388 +0000

The checkpoint value system for a rising-column input in the Splunk DB Connect app doesn't play nicely with either of these fields, and I've tried creating temporary fields and tables to bypass these issues, but to no avail.

For the QUERY_ID, I tried removing the hyphens and replacing the letters a, b, c, d, e, f with the numeric values 1, 2, 3, 4, 5, 6, storing the result in a new field called QUERY_ID_NUMERIC. When I try that out as the checkpoint value, the checkpoint never gets updated, so it just ingests the same data over and over again.

Similarly, for QUERY_START_TIME, I've tried casting the TIMESTAMP_LTZ to TIMESTAMP_NTZ and saving that as a new field, QUERY_START_TIME_NTZ. That ingests the data, but the checkpoint value isn't updating either.

I was wondering if anyone has experienced this issue when ingesting these two data sources, and whether they've found any workarounds that resolved it!

Thank you very much!
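One direction worth trying (a sketch only, not tested against your account): give DB Connect a plain numeric, strictly increasing checkpoint column by converting QUERY_START_TIME to an epoch value inside the input query itself, instead of checkpointing on the UUID or the timestamp type directly. The alias QUERY_START_EPOCH below is made up for illustration; the `?` is DB Connect's placeholder for the saved checkpoint in a rising input.

```sql
-- Sketch of a rising-column input query for QUERY_HISTORY.
-- QUERY_START_EPOCH is a derived, illustrative column, not one that
-- exists in the view; DB Connect substitutes the last checkpoint for ?.
SELECT
    QUERY_ID,
    QUERY_START_TIME,
    DATE_PART(EPOCH_MILLISECOND, QUERY_START_TIME) AS QUERY_START_EPOCH
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE DATE_PART(EPOCH_MILLISECOND, QUERY_START_TIME) > ?
ORDER BY QUERY_START_EPOCH ASC
```

With QUERY_START_EPOCH configured as the rising column, the checkpoint is a plain integer, which sidesteps both the non-monotonic UUID ordering problem and the timestamp-type comparison issue described above.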
