Splunk Cloud Platform

Best Practices for Streaming Logs from Splunk Cloud to External Platforms

NavS
Engager

Hi Splunk Community,

I need advice on the best approach for streaming logs from Splunk Cloud Platform to an external platform. The logs are already being ingested into Splunk Cloud from various applications used by my client's organization. Now, the requirement is to forward or stream these logs to an external system for additional processing and analytics.

#Splunk cloud

Thank you 

Nav

1 Solution

tscroggins
Influencer

Hi @NavS,

Refer to https://docs.splunk.com/Documentation/SplunkCloud/latest/Service/SplunkCloudservice for supported data egress methods:

Data egress options:

- Dynamic Data Self-Storage export of aged data per index from Splunk Cloud Platform to Amazon S3 or Google Cloud Storage
  Limit: none on the amount of data that can be exported from your indexes to an Amazon S3 or Google Cloud Storage account in the same region. Dynamic Data Self-Storage is designed to export 1 TB of data per hour.

- Search results via UI or REST API
  Limit: recommended no more than 10% of ingested data. For optimal performance, no single query, or all queries in aggregate over the day from the UI or REST API, should return full results of more than 10% of ingested daily volume. To route data to multiple locations, consider solutions like Ingest Actions, Ingest Processor, or the Edge Processor solution. (See the REST API sketch after this list.)

- Search results to Splunk User Behavior Analytics (UBA)
  Limit: none. Data as a result of search queries to feed into Splunk UBA.
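If the REST API route fits within that 10% guidance, pulling results out on a schedule is easy to script. A minimal sketch, assuming token authentication and access to the management port (8089) on your stack; the host, token, and index names below are placeholders:

```python
# Sketch: streaming search results out of Splunk Cloud over the REST API.
# Assumes a Splunk authentication token and network access to port 8089;
# host, token, and index names are placeholders.
import requests

SPLUNK_HOST = "https://example.splunkcloud.com:8089"  # placeholder stack URL
TOKEN = "<your-authentication-token>"                 # placeholder token

# /services/search/jobs/export streams results as they are produced,
# so larger exports don't have to be buffered in memory.
resp = requests.post(
    f"{SPLUNK_HOST}/services/search/jobs/export",
    headers={"Authorization": f"Bearer {TOKEN}"},
    data={
        "search": "search index=app_logs earliest=-15m",  # placeholder index
        "output_mode": "json",
    },
    stream=True,
    timeout=300,
)
resp.raise_for_status()

for line in resp.iter_lines():
    if line:
        # Each line is one JSON-encoded event; hand it off to the
        # external platform here instead of printing.
        print(line.decode("utf-8"))
```

You'd typically run something like this on a schedule, or as a small service that pushes each event to the external platform, keeping the total volume inside the 10% recommendation.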

To stream events to both Splunk Cloud and another destination, an intermediate forwarding solution is required.
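For example, a heavy forwarder placed in front of Splunk Cloud can clone events to a second tcpout group. A minimal outputs.conf sketch, assuming the Splunk Cloud forwarder credentials app already defines a tcpout group named splunkcloud, and that external.example.com is a placeholder for a receiver that accepts Splunk-to-Splunk (S2S) traffic:

```
# outputs.conf on the intermediate heavy forwarder (sketch)
[tcpout]
# Listing both target groups clones every event to both destinations
defaultGroup = splunkcloud, external_platform

[tcpout:external_platform]
# Placeholder receiver; it must accept S2S traffic on this port
server = external.example.com:9997
```

If the external platform doesn't accept S2S, you'd instead route selected events to a syslog output group (via _SYSLOG_ROUTING in props.conf/transforms.conf) or put a broker in between.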

You should contact your client's Splunk account team for confirmation, but your Splunk Cloud native options are likely limited to the methods above.


NavS
Engager

Thank you @tscroggins 

