
Forward data to third-party systems from Splunk

New Member

Hi,

Splunk Version: 7.1.1

We are planning to send existing Splunk data to a third-party system called Champ. Though I have gone through the Splunk documentation below, I still have many questions before proceeding. Can anyone please share your experience? I would greatly appreciate any ideas.

Our main goal is to forward existing data to the third-party tool through a Splunk heavy forwarder.

Forward data to third-party systems: https://docs.splunk.com/Documentation/Splunk/7.3.0/Forwarding/Forwarddatatothird-partysystemsd

I still have the questions below.

Splunk forwarders can forward raw data to non-Splunk systems over a plain TCP socket: Splunk forwarder to Champ.

Q: Do we have to send existing data to the heavy forwarder, or send fresh data to it?
Q: To send existing data to the HF, what configurations do we need to write, and where should we write them? That is, through the deployment server, the cluster master, or directly on the heavy forwarder?

By editing outputs.conf, props.conf, and transforms.conf, you can configure a heavy forwarder to route data conditionally to third-party systems.

Q: Are these configuration edits made on the heavy forwarder itself?

As I understand the documentation, we should edit outputs.conf to specify the receiving host and port. We will get the host and port from the third-party system.

Q: Where exactly should we edit outputs.conf? Through the deployment server, the cluster master, or directly on the heavy forwarder?

Q: Conversely, do we need to provide any Splunk server details to the third-party system's users so they can configure anything on their end?
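For reference, a minimal outputs.conf sketch for sending raw (uncooked) data over plain TCP to a non-Splunk receiver might look like the following; the group name `champ_group` and the host:port are placeholders that would come from the Champ team:

```ini
# outputs.conf (on the heavy forwarder)
# Placeholder group name and receiver address; replace with
# the host and port supplied by the third-party system.
[tcpout:champ_group]
server = champ.example.com:9997
# Send plain raw data rather than the Splunk-to-Splunk cooked protocol
sendCookedData = false
```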

We should:

Edit props.conf to determine what data to route.

Q: How do we determine which data needs to be routed?

Edit transforms.conf to determine where to route the data based on what you configured in props.conf.
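As an illustration of those two steps, a minimal routing sketch could look like this, assuming a tcpout group named `champ_group` already exists in outputs.conf and that the data of interest has a hypothetical sourcetype called `champ_data` (both names are placeholders):

```ini
# props.conf
# Apply a routing transform to events of the placeholder sourcetype
[champ_data]
TRANSFORMS-routing = route_to_champ

# transforms.conf
# Match every event (REGEX = .) and set its TCP routing
# to the champ_group output defined in outputs.conf
[route_to_champ]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = champ_group
```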


Champion

Just a comment: it is far easier to retrieve data through the Splunk REST API search services.
The documentation is here: https://docs.splunk.com/Documentation/Splunk/7.3.0/RESTTUT/RESTsearches
You can run a simple Python- or curl-based batch job.
For example, I have a static CSV that is updated manually, from which I just run a .bat file every day and export the results via the API:

\etc\apps\search\bin>curl -ku admin:admin https://10.199.90.48:8089/servicesNS/admin/search/search/jobs/export -d search='search index="inctsk" | table Incident,Task,Title' -d output_mode=json
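The same call can be sketched in Python. The host, port, and credentials below are the placeholders from the curl example above; only the request construction is shown here, so no live connection is made:

```python
# Sketch: build a request for Splunk's REST search export endpoint.
# Host, port, and credentials are placeholders from the example above.
import urllib.parse

BASE_URL = "https://10.199.90.48:8089"

def build_export_request(search, output_mode="json"):
    """Return the URL and urlencoded form body for /search/jobs/export."""
    url = BASE_URL + "/servicesNS/admin/search/search/jobs/export"
    body = urllib.parse.urlencode({"search": search,
                                   "output_mode": output_mode})
    return url, body.encode("ascii")

url, body = build_export_request(
    'search index="inctsk" | table Incident,Task,Title')
# To actually run it, POST `body` to `url` with basic auth
# (e.g. via urllib.request or the `requests` package).
```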
