Splunk Enterprise

Best Practices for Forwarding Data from Splunk to Third-Party Systems

GHAITHQR
Loves-to-Learn Lots

Hi all,

I am currently facing an issue in my Splunk environment. We need to forward data from Splunk to a third-party system, specifically Elasticsearch.

For context, my setup consists of two indexers, one search head, and one deployment server. Could anyone share the best practices for achieving this?

I’d appreciate any guidance or recommendations to ensure a smooth and efficient setup.

Thanks in advance!

0 Karma

GHAITHQR
Loves-to-Learn Lots

Thank you for your response.

We will need to forward both old and new data, and the process of forwarding will be continuous.

0 Karma

PickleRick
SplunkTrust

OK. You can't "forward" already existing data. You need to search the indexes, create a "dump" of the results and push them to the other solution.
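As a very rough sketch of that approach (hosts, credentials, the index name and the time range below are just placeholders, and you would want proper TLS verification, retries and error handling in real life), something like this could read events back out over the REST export endpoint and push them into Elasticsearch with the bulk API:

```python
# Sketch only: read already-indexed events back out of Splunk over the REST
# export endpoint and push them to Elasticsearch via the _bulk API.
# Hosts, credentials, the index and the time range are all placeholders.
import json
import requests

SPLUNK = "https://splunk.example.com:8089"      # management port of a search head
ES = "https://elastic.example.com:9200"
SPLUNK_AUTH = ("admin", "changeme")
ES_AUTH = ("elastic", "changeme")

# Stream the results so the whole dump is never held in memory.
resp = requests.post(
    f"{SPLUNK}/services/search/jobs/export",
    auth=SPLUNK_AUTH,
    verify=False,                               # placeholder; verify TLS properly in real life
    data={
        "search": "search index=main earliest=-7d@d latest=now",
        "output_mode": "json",
    },
    stream=True,
)
resp.raise_for_status()

def flush(batch):
    """Send one NDJSON batch to the Elasticsearch bulk API."""
    requests.post(
        f"{ES}/_bulk",
        auth=ES_AUTH,
        verify=False,
        data="\n".join(batch) + "\n",
        headers={"Content-Type": "application/x-ndjson"},
    ).raise_for_status()

batch = []
for line in resp.iter_lines():
    if not line:
        continue
    event = json.loads(line)
    if "result" not in event:                   # skip preview/progress lines
        continue
    batch.append(json.dumps({"index": {"_index": "splunk-export"}}))
    batch.append(json.dumps(event["result"]))
    if len(batch) >= 1000:                      # 2 lines per event -> 500 events per bulk call
        flush(batch)
        batch = []

if batch:
    flush(batch)
```

In practice you would run this per index and in bounded time slices. The point is just that historical data has to be searched out and re-ingested on the other side, not "forwarded".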

For the continuously incoming data, you can either use syslog output on your indexers or an S2S output to an external component (either a HF or a third-party solution that can talk S2S). The caveat here is that this introduces additional complexity and possible points of failure on your indexers.
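A minimal sketch of the syslog variant, assuming the group name, receiver host and port are made up and that something syslog-capable (Logstash or similar) sits in front of Elasticsearch; check outputs.conf.spec for your version before relying on the exact settings:

```
# outputs.conf on each indexer (can be pushed out via the deployment server)
[syslog]
defaultGroup = to_elastic

[syslog:to_elastic]
server = syslog-receiver.example.com:514
type = tcp
```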

If your architecture had a separate HF layer in front of the indexers, through which all input streams passed, you could do that on the HF layer instead of on the indexers.
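For completeness, selective routing on such a HF (or on the indexers themselves) is usually done with props/transforms. Roughly, and with invented sourcetype, group and host names:

```
# props.conf
[my_sourcetype]
TRANSFORMS-copy_to_elastic = send_to_elastic

# transforms.conf
[send_to_elastic]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = to_elastic

# outputs.conf
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997

[syslog:to_elastic]
server = syslog-receiver.example.com:514
type = tcp
```

With that in place the events still reach the indexers through the tcpout default group, while a syslog copy of the matching sourcetype goes to the to_elastic group.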

Forwarding to an external system is, in general, tricky to do well. You might want to engage PS or your local Splunk Partner.

0 Karma

GHAITHQR
Loves-to-Learn Lots

Thank you for your detailed response!

If I were to implement a Heavy Forwarder (HF) in my architecture, would this be the correct approach? Additionally, would it be considered a best practice for forwarding data to an external system?

0 Karma

isoutamo
SplunkTrust
One option could be to use a separate data collector agent like Filebeat, instead of trying to duplicate the data somewhere in the middle of its path to Splunk. But as said, you should get someone to look at and understand your situation and then make the plan for what is best for you.
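Just as an illustration of that idea (paths, hosts and credentials are placeholders), a minimal filebeat.yml reading the same files the UF already monitors could look roughly like this:

```
# filebeat.yml - read the same files the Splunk UF already monitors
filebeat.inputs:
  - type: filestream
    id: app-logs
    paths:
      - /var/log/app/*.log

output.elasticsearch:
  hosts: ["https://elastic.example.com:9200"]
  username: "elastic"
  password: "changeme"
```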
0 Karma

PickleRick
SplunkTrust

"Best practice" depends heavily on the use case. There are some general best practices, but they might not be well suited to the particular situation at hand. That's why I suggest involving a skilled professional who will review your detailed requirements and suggest an appropriate solution.

0 Karma

isoutamo
SplunkTrust
How much data do you have?
Only one or some indexes, or all of them?
0 Karma

GHAITHQR
Loves-to-Learn Lots

I apologize for the mistake in my previous reply about forwarding data. To clarify, the data to be forwarded will be new data only.

Regarding your question, could you clarify what you mean by "how much data"? Are you asking about the data volume per day or the total size of all data?

The data to be forwarded comes from two indexers, and it includes all indexes.

0 Karma

isoutamo
SplunkTrust
Do you need to forward new data only, or also already indexed data?

If only new data, then you can e.g. use your own Filebeat etc. to collect the same inputs, or configure your current UF to forward to two places (see the sketch below).
If the need includes already indexed data, then the amount of it, and whether this is a one-shot or a continuous need, will determine how best to do it.
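For the "forward to two places" variant, a rough outputs.conf sketch on the UF could look like this. Both destinations must speak S2S, so the second one would be a HF or an S2S-capable third-party collector; all names below are placeholders:

```
# outputs.conf on the universal forwarder
[tcpout]
# listing two groups in defaultGroup clones every event to both of them
defaultGroup = splunk_indexers, to_third_party

[tcpout:splunk_indexers]
server = idx1.example.com:9997, idx2.example.com:9997

[tcpout:to_third_party]
server = hf-or-collector.example.com:9997
```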
0 Karma