
Best Practices for Streaming Logs from Splunk Cloud to External Platforms

NavS
Engager

Hi Splunk Community,

I need advice on the best approach for streaming logs from Splunk Cloud Platform to an external platform. The logs are already being ingested into Splunk Cloud from various applications used by my client's organization. Now, the requirement is to forward or stream these logs to an external system for additional processing and analytics.

#Splunk cloud

Thank you 

Nav

1 Solution

tscroggins
Influencer

Hi @NavS,

Refer to https://docs.splunk.com/Documentation/SplunkCloud/latest/Service/SplunkCloudservice for supported data egress methods:

1. Dynamic Data Self-Storage: exports aged data per index from Splunk Cloud Platform to Amazon S3 or Google Cloud Storage. There is no limit to the amount of data that can be exported from your indexes to your own Amazon S3 or Google Cloud Storage account in the same region; the feature is designed to export 1 TB of data per hour.

2. Search results via the UI or REST API (see the sketch after this list): recommended at no more than 10% of ingested data. For optimal performance, no single query, nor all queries in aggregate over the day from the UI or REST API, should return full results of more than 10% of daily ingested volume. To route data to multiple locations, consider solutions like Ingest Actions, Ingest Processor, or the Edge Processor solution.

3. Search results to Splunk User Behavior Analytics (UBA): no limit. This covers data returned by search queries that feed into Splunk UBA.
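If the REST API route fits within your daily volume, a minimal sketch of streaming search results out of Splunk Cloud via the export endpoint could look like the following. The stack URL, service account credentials, index name, and time range are placeholders, and depending on your stack your account team may first need to enable REST API access on port 8089:

import requests

# Placeholders; substitute your Splunk Cloud stack, credentials, and search.
BASE_URL = "https://yourstack.splunkcloud.com:8089"
AUTH = ("svc_export", "changeme")  # service account, or use a bearer token header instead

# The export endpoint streams results as the search runs instead of
# waiting for the whole search job to finish.
response = requests.post(
    f"{BASE_URL}/services/search/jobs/export",
    auth=AUTH,
    data={
        "search": "search index=app_logs earliest=-15m",
        "output_mode": "json",
    },
    stream=True,
)
response.raise_for_status()

for line in response.iter_lines():
    if not line:
        continue
    # Each non-empty line is one JSON-encoded result; hand it to the
    # external platform's ingestion API here.
    print(line.decode("utf-8"))

Keep in mind the 10% guideline above applies to everything pulled this way in aggregate, so this approach suits targeted subsets rather than full-volume replication.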

To stream events to both Splunk Cloud and another destination, an intermediate forwarding solution is required.
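One common pattern is a heavy forwarder (or intermediate forwarding tier) that receives the data first and clones it to both Splunk Cloud and the external system. A rough outputs.conf sketch, with hypothetical group names and hosts, might look like this:

# outputs.conf on the intermediate heavy forwarder (group names and hosts are examples)
[tcpout]
defaultGroup = splunk_cloud, external_system

[tcpout:splunk_cloud]
# Normally populated by the Splunk Cloud universal forwarder credentials app
server = inputs1.yourstack.splunkcloud.com:9997

[tcpout:external_system]
server = collector.example.com:9997
# sendCookedData = false sends raw events, which most non-Splunk receivers expect
sendCookedData = false

Note that this only helps for data in flight; events already indexed in Splunk Cloud can only leave via the egress methods listed above.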

You should contact your client's Splunk account team for confirmation, but your Splunk Cloud native options are likely limited to the egress methods listed above.


NavS
Engager

Thank you @tscroggins 

