Splunk Cloud Platform

Best Practices for Streaming Logs from Splunk Cloud to External Platforms

NavS
Engager

Hi Splunk Community,

I need advice on the best approach for streaming logs from Splunk Cloud Platform to an external platform. The logs are already being ingested into Splunk Cloud from various applications used by my client's organization. Now, the requirement is to forward or stream these logs to an external system for additional processing and analytics.

#Splunk cloud

Thank you 

Nav


tscroggins
Influencer

Hi @NavS,

Refer to https://docs.splunk.com/Documentation/SplunkCloud/latest/Service/SplunkCloudservice for supported data egress methods:

Data egress methods:

- Dynamic Data Self-Storage: exports aged data per index from Splunk Cloud Platform to Amazon S3 or Google Cloud Storage. There is no limit to the amount of data that can be exported from your indexes to an Amazon S3 or Google Cloud Storage account in the same region; Dynamic Data Self-Storage is designed to export 1 TB of data per hour.
- Search results via UI or REST API: recommended at no more than 10% of ingested data. For optimal performance, no single query, and no aggregate of all queries over the day, should return full results of more than 10% of daily ingested volume. To route data to multiple locations, consider solutions like Ingest Actions, Ingest Processor, or the Edge Processor solution.
- Search results to Splunk User Behavior Analytics (UBA): no limit on data returned by search queries that feed into UBA.
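To illustrate the "search results via REST API" option, here is a minimal Python sketch of pulling events through the `/services/search/jobs/export` endpoint, which streams newline-delimited JSON when `output_mode=json`. The stack host, credentials, and search string are placeholders, and the live request is shown only in comments since it needs network access to your stack:

```python
import json

def parse_export_stream(lines):
    """Parse newline-delimited JSON records returned by the export
    endpoint with output_mode=json, keeping only event payloads."""
    events = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        record = json.loads(line)
        # Each event record wraps the fields under the "result" key;
        # control records (e.g. previews) are skipped.
        if "result" in record:
            events.append(record["result"])
    return events

# Live usage sketch (requires the `requests` package and a reachable
# Splunk Cloud management port; <stack> and credentials are placeholders):
# import requests
# resp = requests.post(
#     "https://<stack>.splunkcloud.com:8089/services/search/jobs/export",
#     auth=("svc_export_user", "<password>"),
#     data={"search": "search index=main earliest=-15m",
#           "output_mode": "json"},
#     stream=True,
# )
# events = parse_export_stream(resp.iter_lines(decode_unicode=True))
```

Keep the 10% daily-volume guidance above in mind if you schedule this kind of export regularly.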

To stream events to both Splunk Cloud and another destination, an intermediate forwarding solution is required.
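As a sketch of what that intermediate forwarding solution could look like, a heavy forwarder placed in front of Splunk Cloud can clone events to a second destination via `outputs.conf`. The host names, ports, and group names below are hypothetical; the Splunk Cloud output group is normally configured by the credentials package downloaded from your stack:

```ini
# outputs.conf on an intermediate heavy forwarder (hypothetical hosts/ports)

[tcpout]
# Listing both groups clones every event to both destinations.
defaultGroup = splunkcloud, external_platform

# Group 1: your Splunk Cloud stack (normally set up by the
# Splunk Cloud forwarder credentials app).
[tcpout:splunkcloud]
server = inputs.example-stack.splunkcloud.com:9997

# Group 2: the external platform. sendCookedData = false sends raw
# TCP so a non-Splunk receiver can parse the stream.
[tcpout:external_platform]
server = external.example.com:5140
sendCookedData = false
```

Note that cloning at a forwarder means the external system receives events before Splunk Cloud index-time processing, so any index-time transformations applied in the cloud will not be reflected in the cloned copy.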

You should contact your client's Splunk account team for confirmation, but your Splunk Cloud native options are likely limited to the table above.


NavS
Engager

Thank you @tscroggins 

