Context:
We have Splunk ES set up on-prem.
We want to extract the required payloads through queries, generate scheduled reports (e.g., daily), and export these to a cloud location for ingestion by Snowflake.
Requirement:
1. Is there any way to set up an API connection with Snowflake so that it can call the Splunk API to extract specific logs from a specific index in Splunk?
2. If #1 is not possible, can we at least run queries and send the resulting report to a cloud repository for Snowflake to extract from?
TIA
Hi @SCK
I do not know much about Snowflake, but it seems you might be able to create a User Defined Function (UDF) and then use Python to call the Splunk REST API to pull your data?
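If you go that route, a minimal sketch of the Python side might look like the following. Everything here is an assumption, not your environment: the hostname, token, index name and time window are placeholders, and calling out from Snowflake would additionally need an external access integration (or you could run the same code outside Snowflake on a scheduler).

```python
# Hypothetical Python handler that pulls events from Splunk over the REST API.
# Host, token, index and time window are placeholders; adjust for your setup.
import json
import requests

SPLUNK_HOST = "https://splunk.example.com:8089"   # Splunk management port (assumption)
SPLUNK_TOKEN = "<splunk-auth-token>"              # placeholder credential

def pull_splunk_events(index: str, earliest: str = "-24h") -> list[dict]:
    """Run an export search against one index and return the JSON results."""
    resp = requests.post(
        f"{SPLUNK_HOST}/services/search/jobs/export",
        headers={"Authorization": f"Bearer {SPLUNK_TOKEN}"},
        data={
            "search": f"search index={index} earliest={earliest}",
            "output_mode": "json",
        },
        stream=True,
        verify=False,  # replace with a proper CA bundle in production
    )
    resp.raise_for_status()

    events = []
    for line in resp.iter_lines():
        if not line:
            continue
        record = json.loads(line)
        if "result" in record:   # export also streams preview/metadata rows
            events.append(record["result"])
    return events
```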
If this isn't an option, then you might be able to achieve the same result by using something like the Amazon S3 Sink Alert Action for Splunk to send your output from Splunk into S3 before importing it into Snowflake.
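On the Snowflake side, once the report files land in S3, loading them is mostly an external stage plus COPY INTO. A rough sketch via snowflake-connector-python, where the bucket, storage integration, table and file format are all assumptions:

```python
# Hypothetical load of Splunk report CSVs from S3 into Snowflake.
# Bucket, storage integration, connection details and table are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<warehouse>", database="<database>", schema="<schema>",
)

statements = [
    # Stage pointing at the bucket the Splunk alert action writes to
    """
    CREATE STAGE IF NOT EXISTS splunk_reports_stage
      URL = 's3://my-splunk-exports/reports/'
      STORAGE_INTEGRATION = my_s3_integration
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
    """,
    # Copy any new files into a landing table (table schema is an assumption)
    """
    COPY INTO splunk_events_raw
      FROM @splunk_reports_stage
      PATTERN = '.*[.]csv'
      ON_ERROR = 'CONTINUE';
    """,
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```

A Snowflake task or Snowpipe on the same stage could then keep the loads in step with the daily report schedule.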
@SCK
1. Snowflake calls the Splunk API directly: possible with the GetSplunk processor in Snowflake Openflow. Reference: https://docs.snowflake.com/en/user-guide/data-integration/openflow/processors/getsplunk
2. Splunk exports reports to a cloud repo: schedule Splunk searches/reports and export the results. Configure Splunk to send the scheduled report output to supported cloud storage using scripts (Python, Bash) or Splunk alert actions, then ingest into Snowflake using external stages (see the sketch below).
Ref: https://estuary.dev/blog/snowflake-data-ingestion/#:~:text=The%20first%20step%20is%20to,stage%20(e.g.%2C%20CSV).
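For the export step in option 2, a minimal scheduled-script sketch could look like this. It assumes a scheduled report already exists in Splunk and that boto3 credentials for the target bucket are available; the host, token, saved-search name and bucket are placeholders, not anything from your setup.

```python
# Hypothetical daily export: re-run a saved Splunk search over the REST API
# and push the results to S3 as CSV for Snowflake to ingest from a stage.
# Host, token, saved-search name and bucket are placeholders.
import datetime
import boto3
import requests

SPLUNK_HOST = "https://splunk.example.com:8089"
SPLUNK_TOKEN = "<splunk-auth-token>"
SAVED_SEARCH = "daily_security_payloads"   # assumed scheduled report name
BUCKET = "my-splunk-exports"

def export_report_to_s3() -> str:
    """Fetch the saved search results as CSV and upload them to S3."""
    resp = requests.post(
        f"{SPLUNK_HOST}/services/search/jobs/export",
        headers={"Authorization": f"Bearer {SPLUNK_TOKEN}"},
        data={"search": f'| savedsearch "{SAVED_SEARCH}"', "output_mode": "csv"},
        verify=False,  # replace with a proper CA bundle in production
    )
    resp.raise_for_status()

    key = f"reports/{SAVED_SEARCH}/{datetime.date.today():%Y-%m-%d}.csv"
    boto3.client("s3").put_object(Bucket=BUCKET, Key=key, Body=resp.content)
    return f"s3://{BUCKET}/{key}"

if __name__ == "__main__":
    print("uploaded", export_report_to_s3())
```

Run it from cron (or any scheduler) shortly after the Splunk report's scheduled time, and point a Snowflake external stage at the same prefix to pick the files up.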
Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving kudos. Thanks!