Splunk Cloud Platform

Best Practices for Streaming Logs from Splunk Cloud to External Platforms

NavS
Engager

Hi Splunk Community,

I need advice on the best approach for streaming logs from Splunk Cloud Platform to an external platform. The logs are already being ingested into Splunk Cloud from various applications used by my client's organization. Now, the requirement is to forward or stream these logs to an external system for additional processing and analytics.

#Splunk cloud

Thank you 

Nav

1 Solution

tscroggins
Influencer

Hi @NavS,

Refer to https://docs.splunk.com/Documentation/SplunkCloud/latest/Service/SplunkCloudservice for supported data egress methods:

Data egress method | Limit | Notes
Dynamic Data Self-Storage (export of aged data per index to Amazon S3 or Google Cloud Storage) | No limit on the amount of data exported from your indexes to an Amazon S3 or Google Cloud Storage account in the same region | Designed to export 1 TB of data per hour.
Search results via UI or REST API | Recommended no more than 10% of ingested data | For optimal performance, no single query, and no aggregate of all queries over the day, should return full results exceeding 10% of daily ingested volume. To route data to multiple locations, consider Ingest Actions, Ingest Processor, or the Edge Processor solution.
Search results to Splunk User Behavior Analytics (UBA) | No limit | Data returned by search queries is fed into Splunk UBA.
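For the "search results via REST API" path, a minimal sketch of the export call looks like the following. The hostname, index, and sourcetype are placeholders for illustration, and the actual HTTP request (which needs a valid Splunk auth token) is left commented out; note the 10% daily-volume guidance above when sizing exports.

```python
import urllib.parse
import urllib.request  # only needed for the commented-out call below


def build_export_url(host, search, earliest="-15m", latest="now"):
    """Build a URL for Splunk's streaming export endpoint,
    /services/search/jobs/export, on the management port (8089)."""
    base = f"https://{host}:8089/services/search/jobs/export"
    params = urllib.parse.urlencode({
        "search": search,            # must begin with the 'search' command
        "earliest_time": earliest,
        "latest_time": latest,
        "output_mode": "json",       # one JSON event per line, easy to stream
    })
    return f"{base}?{params}"


# Placeholder host, index, and sourcetype -- substitute your own values.
url = build_export_url(
    "example.splunkcloud.com",
    "search index=app_logs sourcetype=access_combined",
)

# Uncomment to stream results to a downstream system (requires a token):
# req = urllib.request.Request(
#     url, headers={"Authorization": "Bearer <your-token>"}
# )
# with urllib.request.urlopen(req) as resp:
#     for line in resp:
#         if line.strip():
#             pass  # hand each JSON event to your external platform here
```

Because the export endpoint streams results as they are found, it avoids buffering an entire result set in a search job, which matters when pulling anything close to the recommended volume ceiling.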

To stream events to both Splunk Cloud and another destination, an intermediate forwarding solution is required.
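One common pattern for that intermediate tier is a heavy forwarder in front of Splunk Cloud that clones its output stream to two tcpout groups. A sketch of the forwarder's outputs.conf follows; the group names and server addresses are placeholders, and in practice the Splunk Cloud group is normally configured by the Universal Forwarder credentials app downloaded from your Splunk Cloud stack.

```ini
# outputs.conf on an intermediate (heavy) forwarder -- illustrative only.
# Listing both groups in defaultGroup clones every event to both
# destinations.
[tcpout]
defaultGroup = splunk_cloud, external_platform

[tcpout:splunk_cloud]
# Splunk Cloud indexers (placeholder address; use the settings from
# your Splunk Cloud credentials app)
server = inputs.example.splunkcloud.com:9997

[tcpout:external_platform]
# Third-party collector listening on plain TCP. sendCookedData = false
# sends raw events rather than Splunk's cooked S2S wire format, which
# most non-Splunk receivers expect.
server = collector.example.com:5140
sendCookedData = false
```

Whether the external side should receive raw or cooked data depends on what it can parse; if it is another Splunk instance, leave sendCookedData at its default.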

You should contact your client's Splunk account team for confirmation, but your Splunk Cloud native options are likely limited to the table above.


NavS
Engager

Thank you @tscroggins 

