Splunk Enterprise Security

Splunk Snowflake Integration, but the other way around

SCK
Loves-to-Learn

Context:
We have Splunk ES set up on-prem.
We want to extract the required payloads through queries, generate scheduled reports (e.g., daily), and export these to a cloud location for ingestion by Snowflake.
Requirement:
1. Is there any way we can have an API connection with Snowflake where it can call the API to extract specific logs from a specific index in Splunk?
2. If #1 is not possible, can we at least run queries and send that report to a cloud repository for Snowflake to extract from?

 

TIA


livehybrid
SplunkTrust

Hi @SCK 

I do not know much about Snowflake, but it seems you might be able to create a User Defined Function (UDF) and then use Python to call the Splunk REST API to pull your data.
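On the Splunk side, the pull itself is just a REST call to the search export endpoint. Here is a minimal Python sketch; the host, token, and index name are placeholders, and wiring this into a Snowflake UDF would additionally need whatever outbound network access Snowflake allows for Python UDFs:

```python
import requests

# Placeholder values for illustration only
SPLUNK_HOST = "https://splunk.example.com:8089"   # Splunk management port
SPLUNK_TOKEN = "<splunk-auth-token>"

def export_splunk_events(index, earliest="-1d@d", latest="@d"):
    """Stream events for one index from the Splunk search export endpoint."""
    resp = requests.post(
        f"{SPLUNK_HOST}/services/search/jobs/export",
        headers={"Authorization": f"Bearer {SPLUNK_TOKEN}"},
        data={
            "search": f"search index={index} earliest={earliest} latest={latest}",
            "output_mode": "json",
        },
        stream=True,
        verify=False,  # on-prem instances often use self-signed certs; prefer a CA bundle
    )
    resp.raise_for_status()
    # The export endpoint streams one JSON object per line
    return [line for line in resp.iter_lines() if line]

if __name__ == "__main__":
    rows = export_splunk_events("your_index")
    print(f"Pulled {len(rows)} result lines")
```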

If this isn't an option, then you might be able to achieve the same result by using something like the Amazon S3 Sink Alert Action for Splunk to send your output from Splunk into S3 before importing it into Snowflake.
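Once the results land in S3 (via that alert action or a script), the Snowflake side of the import could look roughly like this sketch, assuming an external stage named SPLUNK_S3_STAGE already points at the bucket and the target table SPLUNK_EVENTS has a single VARIANT column (both hypothetical names):

```python
import snowflake.connector

# Connection details are placeholders
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)

try:
    cur = conn.cursor()
    # Assumes SPLUNK_S3_STAGE points at the bucket the alert action writes to,
    # and SPLUNK_EVENTS has a single VARIANT column to hold the raw JSON events
    cur.execute("""
        COPY INTO SPLUNK_EVENTS
        FROM @SPLUNK_S3_STAGE
        FILE_FORMAT = (TYPE = 'JSON')
        ON_ERROR = 'CONTINUE'
    """)
    print(cur.fetchall())  # one row per loaded file with its load status
finally:
    conn.close()
```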

🌟 Did this answer help you? If so, please consider:

  • Adding karma to show it was useful
  • Marking it as the solution if it resolved your issue
  • Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing


PrewinThomas
Motivator

@SCK 

1. Snowflake calling the Splunk API directly: possible with Snowflake's Openflow GetSplunk processor.
Reference: https://docs.snowflake.com/en/user-guide/data-integration/openflow/processors/getsplunk

2. Splunk exporting reports to a cloud repo: schedule Splunk searches/reports and export the results. Configure Splunk to send the scheduled report output to supported cloud storage using scripts (Python, Bash), Splunk alert actions, etc., then ingest it into Snowflake using external stages (a rough script sketch follows the reference below).

Reference: https://estuary.dev/blog/snowflake-data-ingestion/#:~:text=The%20first%20step%20is%20to,stage%20(e.g.%2C%20CSV).
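For that second option, a minimal export script could look like the following: it runs a search against the Splunk REST API, gets the results as CSV, and drops the file into S3 for a Snowflake external stage to pick up. The host, token, bucket, and search string are all placeholders:

```python
import boto3
import requests

# Placeholder values for illustration only
SPLUNK_HOST = "https://splunk.example.com:8089"
SPLUNK_TOKEN = "<splunk-auth-token>"
S3_BUCKET = "my-splunk-exports"

def run_search_as_csv(spl):
    """Run a search via the Splunk REST API and return the results as CSV text."""
    resp = requests.post(
        f"{SPLUNK_HOST}/services/search/jobs/export",
        headers={"Authorization": f"Bearer {SPLUNK_TOKEN}"},
        data={"search": spl, "output_mode": "csv"},
        verify=False,  # prefer a proper CA bundle in production
    )
    resp.raise_for_status()
    return resp.text

def upload_to_s3(body, key):
    """Write the CSV to S3, where a Snowflake external stage can read it."""
    boto3.client("s3").put_object(Bucket=S3_BUCKET, Key=key, Body=body.encode("utf-8"))

if __name__ == "__main__":
    csv_text = run_search_as_csv(
        "search index=your_index earliest=-1d@d latest=@d | table _time host source _raw"
    )
    upload_to_s3(csv_text, "splunk/daily/events.csv")
```

Scheduled via cron (or triggered from a Splunk scheduled search), this gives Snowflake a predictable daily drop to COPY from.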

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a kudos. Thanks!
