Hi Staff,
we have a distributed system with one Splunk Enterprise instance and N Heavy Forwarders pushing data to it.
We would like to back up, every night, one .conf file from each Heavy Forwarder directly into a specific folder on the Enterprise machine, reusing port 9997 or 8089 and avoiding any additional port configuration.
Is this possible?
How can we get the right solution?
Thanks in advance.
Nick
As @livehybrid mentioned, the general idea behind UFs and their management is the other way around: the deployment server distributes config items, which are pulled and applied by the forwarders. So you should not back up your configs centrally from your forwarders but do the complete opposite and have the config centrally distributed. That way, if your forwarder fails or you need to reinstall it, you can simply push the same piece of config again.
I can understand that you might want to back up the state of the forwarder (not the configs), so that if your forwarder breaks and you reinstall it, you don't have to ingest all sources from scratch. But this is a fairly rare corner case, and there is no built-in mechanism for it. And there is definitely no built-in way to send the config or state of the forwarder to either the indexer(s) or the Deployment Server.
Since you can run almost anything as a scripted input, you could try your own duct-tape solution that gathers state files from the forwarder (I'm not sure about reading open files, though, or the consistency of their contents), then, for example, compresses them into an archive and base64-encodes it so that it can be indexed as a text event. But that's a horrible idea.
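For what it's worth, the duct-tape approach could be sketched roughly like this as a scripted input. This is a hypothetical illustration only (and, as said, a bad idea in production); `STATE_DIR` is an assumption, and consistency of open files is not handled:

```shell
#!/bin/sh
# Hypothetical sketch: archive a directory of forwarder state files,
# base64-encode the archive, and emit it as a single-line text event
# so Splunk can index it. STATE_DIR is an assumed path -- point it at
# whatever you want to capture.
STATE_DIR="${STATE_DIR:-/opt/splunk/var/lib/splunk}"
ARCHIVE="$(mktemp /tmp/hf_state.XXXXXX.tgz)"

# Archive the state directory (open-file consistency is NOT guaranteed)
tar -czf "$ARCHIVE" -C "$(dirname "$STATE_DIR")" "$(basename "$STATE_DIR")"

# Emit one event: a header plus the base64 payload on a single line
printf 'STATE_BACKUP host=%s payload=' "$(hostname)"
base64 "$ARCHIVE" | tr -d '\n'
echo
rm -f "$ARCHIVE"
```

On the receiving side you would have to extract the payload field, base64-decode it, and untar it, which is exactly the kind of fragility that makes this a last resort.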
I don't think it's possible to directly back up a .conf file from a Heavy Forwarder to a specific folder on your Splunk Enterprise server using port 9997 or 8089.
The best method is to use standard file transfer tools (SCP, SFTP, ...).
A workaround is to create a scripted input on the Heavy Forwarder that reads the contents of the .conf file and sends it as a log event to the Splunk Enterprise server. On the Enterprise server, you could then use a Splunk alert or script to reconstruct the file from the received event and save it.
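A minimal sketch of that workaround, assuming a scripted input on the HF (the path in `CONF_FILE` is an assumption for illustration):

```shell
#!/bin/sh
# Sketch: print a .conf file as one event, prefixed with a header line
# identifying the host and path, so the receiving side can reconstruct
# the file. CONF_FILE is an assumed path.
CONF_FILE="${CONF_FILE:-/opt/splunk/etc/system/local/inputs.conf}"
echo "CONF_BACKUP host=$(hostname) file=$CONF_FILE"
cat "$CONF_FILE"
```

On the Enterprise side, a search for `CONF_BACKUP` events would give you the file contents to write back out.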
Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
Hi @NickEot
Can I check - do you not deploy to your HF from a Deployment Server? Ideally, a HF would be relatively stateless and rebuildable from a DS if it were to disappear and come back without configuration. However, if you're not in a position to do this, then you would need to look at a custom app to collect the data you need and index it.
I can't find it right now (but I'll keep looking), but I once created a simple app which ran btool and piped the output to a file. You could do something like this as a custom app, configured to write the conf files you are interested in to a file which is then monitored via inputs.conf and ingested to whichever index you need on your indexers.
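Something along these lines, as a rough reconstruction of that app's script. `SPLUNK_BIN` and the output path are assumptions for your environment; the output file would then be picked up by a `[monitor://...]` stanza in inputs.conf:

```shell
#!/bin/sh
# Sketch: dump the effective inputs config via btool into a log file
# that a [monitor://] stanza can ingest. SPLUNK_BIN and OUT are
# assumed paths -- adjust for your deployment.
SPLUNK_HOME="${SPLUNK_HOME:-/opt/splunk}"
SPLUNK_BIN="${SPLUNK_BIN:-$SPLUNK_HOME/bin/splunk}"
OUT="${OUT:-$SPLUNK_HOME/var/log/splunk/btool_inputs.log}"
mkdir -p "$(dirname "$OUT")"

# --debug shows which file each setting comes from
"$SPLUNK_BIN" btool inputs list --debug > "$OUT"
```

You could run the same pattern per config type (`btool outputs list`, `btool props list`, etc.) if you need more than inputs.conf.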