I would like to run the Splunk btool command on a schedule via a scripted input, so that configs are indexed every few hours. I cannot put the command in a .sh or any other script file and point the scripted input at it, due to restrictions on running scripts on our Windows universal forwarders.
So I have put a .path file under the bin directory of the app and pointed the scripted input at that .path file.
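For reference, the scripted-input stanza looks roughly like this (the app name, interval, and index are illustrative assumptions, not the exact config):

```
# $SPLUNK_HOME/etc/apps/my_btool_audit/local/inputs.conf (illustrative)
[script://$SPLUNK_HOME/etc/apps/my_btool_audit/bin/btool.path]
interval = 14400
sourcetype = splunk_btool
index = main
disabled = false
```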
The path file contains the below command:
/opt/splunk/bin/splunk cmd btool inputs list --debug
But the btool command does not run when referenced from the .path file, and no data is indexed.
It appears the .path file can only point to and run external executables such as .sh or .exe files.
Is there any way to run the btool command on UFs on a schedule via Splunk inputs, without using .exe scripts on Windows?
My requirement is to index config data every day.
What's the purpose of running btool every day? I see you want to index configuration data daily. Why not just use a REST command via search?
The REST API is definitely a good option; however, I'm not sure of the coding required to pull back all attributes, and you do need network connectivity to the Splunk management port, which I find is often blocked in certain network zones.
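As a starting point, a search along these lines pulls back config attributes over the management port (the endpoint and field list here are a sketch to adapt, not a complete audit query):

```
| rest /services/configs/conf-inputs splunk_server=*
| fields splunk_server, title, index, sourcetype, disabled
```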
Yeah, as in most cases, it depends. The OP hasn't specified which conf files or which components, so it's all speculation at this point.
I want to run btool to audit configurations from all UFs. REST is not a good option for us due to security reasons. We are looking to use local inputs without executing a .bat script on Windows.
If you cannot run the Splunk btool commands from the command line, I think you need to go with the first suggestion of @skoelpin to use the REST API.
So why do you want to collect conf file information from all the forwarders? Are you trying to monitor who modified a configuration? If so, you could use version control for this.
We are trying to monitor who accidentally changed the configurations on Windows UFs.
May I know how to use version control for this?
I think ingesting configuration files each day is a bad idea for this; it will also cost you money via license usage. A better approach would be to use the deployment server to exclusively send configuration files to the forwarders, and lock down the permissions on those config files on the host. You should then use Bitbucket to version-control the deployment server files that are sent to the hosts.
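The version-control side can be as simple as putting the deployment server's app directory under git (Bitbucket is just one possible remote). A minimal sketch, using a temp directory in place of the real `$SPLUNK_HOME/etc/deployment-apps` path and an example app/monitor stanza:

```shell
# Sketch: version-control deployment-server apps with git.
# On a real deployment server this directory would be
# $SPLUNK_HOME/etc/deployment-apps; here we use a temp dir.
DEPLOY_APPS=$(mktemp -d)/deployment-apps
mkdir -p "$DEPLOY_APPS/my_app/local"
printf '[monitor:///var/log/app.log]\ndisabled = false\n' \
  > "$DEPLOY_APPS/my_app/local/inputs.conf"

cd "$DEPLOY_APPS"
git init -q
git add -A
git -c user.name=splunk-admin -c user.email=admin@example.com \
  commit -qm "baseline forwarder configs"

# After any later change, git diff/log shows exactly what changed:
echo '# changed by someone' >> my_app/local/inputs.conf
git diff --stat
```

Any edit to a pushed config then shows up as an uncommitted diff (or an attributed commit), which answers the "who changed what" question without ingesting the files daily.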
I've done the same thing, mostly for auditing our forwarder configs and making them searchable in the Splunk UI.
I've created my own technical add-ons for forwarders that run btool as a scripted input, using a .sh script for Linux and a .bat file for Windows.
Linux (may work on other Unix OSs as well):
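The Linux side of such an add-on looks roughly like this (a sketch: the SPLUNK_HOME default, the list of conf files, and the interval are assumptions to adapt, not my exact add-on):

```shell
#!/bin/sh
# bin/btool_audit.sh (sketch) - emit effective config for selected conf files.
# SPLUNK_HOME default is an assumption; forwarders often use /opt/splunkforwarder.
SPLUNK_HOME=${SPLUNK_HOME:-/opt/splunkforwarder}
for conf in inputs outputs server; do
    # --debug prefixes each line with the file it came from
    "$SPLUNK_HOME/bin/splunk" cmd btool "$conf" list --debug
done
```

Paired with a scripted-input stanza in the add-on's inputs.conf, e.g.:

```
[script://./bin/btool_audit.sh]
interval = 86400
sourcetype = splunk_btool
disabled = false
```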