Hello, everyone.
I need to distribute a *.csv file that is generated by an external script (i.e., not by Splunk).
The script runs every day and may update the file.
How can I do this in a search head cluster (SHC)?
I tried pushing the file with the deployer, but the main problem with this approach is that a lookup file is only created if it does not already exist on the SHC members. Once I have pushed it, I can't update it.
I understand that I could write an external script that deletes the old file on the SHC members and then pushes a new one via the deployer.
But maybe there is an easier way to handle this?
In your script, copy the file to /opt/splunk/var/run/splunk/lookup_tmp/ (C:\Program Files\Splunk\var\run\splunk\lookup_tmp\ on Windows) on one of the SHs, e.g. SH1.yourdomain.com.
This is the Splunk lookup staging directory.
Make sure to pick a specific search head, not a DNS alias that could resolve to any of them. Knowing exactly which SH you copied the file to matters for the next step.
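For example, a minimal scp call (assuming the machine running your script has SSH access to the SH as the splunk OS user; the source path is hypothetical):

scp /data/output/new_lookup_file_name.csv splunk@SH1.yourdomain.com:/opt/splunk/var/run/splunk/lookup_tmp/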
Then you can use the REST API to promote it to the production version of the lookup via the following endpoint:
curl -k -u admin:pass https://SH1.yourdomain.com:8089/servicesNS/<user>/<app>/data/lookup-table-files/lookup_file_name.csv -d eai:data=/opt/splunk/var/run/splunk/lookup_tmp/new_lookup_file_name.csv
If the lookup is to be shared in the app, you can set the user to nobody.
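For example, for an app-level lookup in a hypothetical app named myapp, the endpoint would be:

https://SH1.yourdomain.com:8089/servicesNS/nobody/myapp/data/lookup-table-files/lookup_file_name.csv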
This API call picks up the CSV called new_lookup_file_name.csv from the staging area and overwrites lookup_file_name.csv in production.
Since the lookup is promoted via the Splunk REST API, Splunk takes care of replicating it to the other SHs in the cluster.
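Putting both steps together, a daily cron job could look roughly like this (a sketch only: the host, credentials, source path, and the app name myapp are assumptions to adapt to your environment):

#!/bin/bash
# Daily lookup refresh: copy the generated CSV to one known SH, then promote it via REST.
SRC=/data/output/new_lookup_file_name.csv   # output of the external script (hypothetical path)
SH=SH1.yourdomain.com                       # one specific SH, not an alias
STAGE=/opt/splunk/var/run/splunk/lookup_tmp

# Step 1: place the new file in the lookup staging directory on that SH.
scp "$SRC" "splunk@$SH:$STAGE/" || exit 1

# Step 2: promote the staged file to the production lookup.
# Splunk then replicates the updated lookup to the other SHC members.
curl -k -u admin:pass "https://$SH:8089/servicesNS/nobody/myapp/data/lookup-table-files/lookup_file_name.csv" -d eai:data="$STAGE/new_lookup_file_name.csv"

Rather than hard-coding admin:pass, a dedicated service account (or an authentication token) with only the capabilities needed to edit lookups is a safer choice.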
Documentation on this can be found here: Knowledge endpoint descriptions - Splunk Documentation
Thank you for the workaround.
I will check out this approach.