Best Practices for Forwarding Data from Splunk to Third-Party Systems
Hi all,
I am currently facing an issue in my Splunk environment. We need to forward data from Splunk to a third-party system, specifically Elasticsearch.
For context, my setup consists of two indexers, one search head, and one deployment server. Could anyone share the best practices for achieving this?
I’d appreciate any guidance or recommendations to ensure a smooth and efficient setup.
Thanks in advance!
Thank you for your response.
We will need to forward both old and new data, and the process of forwarding will be continuous.

OK. You can't "forward" already existing data. You need to search the indexes, create a "dump" of the results and push them to the other solution.
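For the already-indexed part, a minimal sketch of the "dump" idea could be an ad-hoc or scheduled search like the one below (the index name, time range and file name are just placeholders, not anything from your environment):

```
index=main earliest=-30d@d latest=now
| outputcsv historical_export.csv
```

That writes the results as CSV under $SPLUNK_HOME/var/run/splunk/csv on the search head, from where you can pick the file up and bulk-load it into Elasticsearch with whatever tooling you prefer.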
For the continuously incoming data you can either use syslog output on your indexers or an S2S output to an external component (either an HF or a third-party solution that can speak S2S). The caveat here is that this introduces additional complexity and possible points of failure on your indexers.
If your architecture had a separate HF layer in front of the indexers, through which all input streams went, you could do that on the HF layer instead of on the indexers.
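As a rough sketch of the syslog option (the group name, hostname and port are placeholders), outputs.conf on each indexer, or on an HF layer if you have one, could look roughly like this:

```
# outputs.conf -- send a copy of events out via syslog
[syslog]
defaultGroup = elastic_syslog

[syslog:elastic_syslog]
server = logstash.example.com:514
type = tcp
```

You would typically point this at something like a Logstash syslog/tcp input, which then writes into Elasticsearch.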
In general, forwarding to an external system is tricky to do well. You might want to engage PS (Professional Services) or your local Splunk Partner.
Thank you for your detailed response!
If I were to implement a Heavy Forwarder (HF) in my architecture, would this be the correct approach? Additionally, would it be considered a best practice for forwarding data to an external system?

"Best practice" depends heavily on use case. There are some general best practices but they might not be suited well to a particular situation at hand. That's why I suggest involving a skilled professional who will review your detailed requirements and suggest appropriate solution.

Only one or some indexes, or all of them?
I apologize for the mistake in my previous reply about forwarding data. To clarify, the data to be forwarded will be new data only.
Regarding your question, could you clarify what you mean by "how much data"? Are you asking about the data volume per day or the total size of all data?
The data to be forwarded comes from two indexers, and it includes all indexes.

If it is only new data, then you can e.g. run your own Filebeat (or similar) to collect the same inputs, or configure your current UFs to forward to two destinations.
If the need also covers already-indexed data, then the amount of data, and whether this is a one-shot or continuous need, will determine how best to do it.
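If you go the "forward to two places" route on the UFs, a minimal outputs.conf sketch might look like this (server names and ports are placeholders):

```
# outputs.conf on the universal forwarder
[tcpout]
defaultGroup = splunk_indexers, third_party

# normal Splunk-to-Splunk output to your two indexers
[tcpout:splunk_indexers]
server = idx1.example.com:9997, idx2.example.com:9997

# raw (uncooked) TCP output for the non-Splunk destination
[tcpout:third_party]
server = logstash.example.com:5044
sendCookedData = false
```

Since you already have a deployment server, you could push a configuration like this to the UFs as a deployment app. Note that the third-party side (e.g. a Logstash tcp input) has to parse the raw stream itself, since it won't receive Splunk's cooked metadata.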
