Best Practices for Streaming Logs from Splunk Cloud to External Platforms

NavS
Engager

Hi Splunk Community,

I need advice on the best approach for streaming logs from Splunk Cloud Platform to an external platform. The logs are already being ingested into Splunk Cloud from various applications used by my client's organization. Now, the requirement is to forward or stream these logs to an external system for additional processing and analytics.

#Splunk cloud

Thank you 

Nav

1 Solution

tscroggins
Influencer

Hi @NavS,

Refer to https://docs.splunk.com/Documentation/SplunkCloud/latest/Service/SplunkCloudservice for supported data egress methods:

Data Egress
- Dynamic Data Self-Storage: export of aged data per index from Splunk Cloud Platform to Amazon S3 or Google Cloud Storage.
  Limit: No limit to the amount of data that can be exported from your indexes to your Amazon S3 or Google Cloud Storage account in the same region.
  Details: Dynamic Data Self-Storage is designed to export 1 TB of data per hour.
- Search results via UI or REST API (a minimal export sketch follows this list).
  Limit: Recommend no more than 10% of ingested data.
  Details: For optimal performance, no single query, or all queries in aggregate over the day from the UI or REST API, should return full results of more than 10% of ingested daily volume. To route data to multiple locations, consider solutions like Ingest Actions, Ingest Processor, or the Edge Processor solution.
- Search results to Splunk User Behavior Analytics (UBA).
  Limit: No limit.
  Details: Data as a result of search queries to feed into Splunk User Behavior Analytics (UBA).
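
For the search results via REST API option, a streaming export search is the usual mechanism. The snippet below is a minimal Python sketch, not a production integration: the stack URL, token, and index name are hypothetical, REST API access to the stack on port 8089 typically has to be enabled for your source IP addresses, and the 10% guidance above still applies.

import json
import requests

BASE_URL = "https://mystack.splunkcloud.com:8089"  # hypothetical Splunk Cloud stack
TOKEN = "your-splunk-authentication-token"         # hypothetical auth token

# /services/search/jobs/export streams results as they are produced;
# with output_mode=json, each line of the response is one JSON object.
response = requests.post(
    f"{BASE_URL}/services/search/jobs/export",
    headers={"Authorization": f"Bearer {TOKEN}"},
    data={
        "search": "search index=app_logs earliest=-15m",  # hypothetical index
        "output_mode": "json",
    },
    stream=True,
)
response.raise_for_status()

for line in response.iter_lines():
    if not line:
        continue
    chunk = json.loads(line)
    result = chunk.get("result")
    if result:
        # Hand the event off to the external platform here (HTTP POST, message queue, etc.).
        print(result.get("_raw"))

Because the export endpoint streams, events can be relayed to the external system as they arrive rather than after the whole search completes.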

To stream events to both Splunk Cloud and another destination, an intermediate forwarding solution is required.
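
A common way to do that is an intermediate heavy forwarder in the client's environment with two output groups: one for Splunk Cloud and one for the external platform. The outputs.conf below is only a sketch; the group names, hosts, and ports are hypothetical, and in practice the Splunk Cloud group is supplied by the universal forwarder credentials app for your stack.

# outputs.conf on an intermediate heavy forwarder (sketch; names and ports are hypothetical)
[tcpout]
defaultGroup = splunkcloud,external_platform

# Output group for Splunk Cloud, normally defined by the
# Splunk Cloud universal forwarder credentials app.
[tcpout:splunkcloud]
server = inputs.mystack.splunkcloud.com:9997

# Second output group for the external platform.
[tcpout:external_platform]
server = collector.example.com:5140
# Send raw (uncooked) data so a non-Splunk receiver can parse it.
sendCookedData = false

If only a subset of data should go to the external platform, selective routing can be layered on with props.conf/transforms.conf (_TCP_ROUTING), or the Ingest Actions / Edge Processor options mentioned above may cover the routing, depending on the destinations they support.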

You should contact your client's Splunk account team for confirmation, but your Splunk Cloud-native options are likely limited to those listed above.

NavS
Engager

Thank you @tscroggins 
