Hello Community, I'm relatively new to Splunk and am now struggling with a challenge that is fairly big for me. The scenario:
Configuration of a retention policy of 120 days. Every directory that arrives in frozen must be compressed and encrypted. For the compression and encryption I already have a small shell script that does the job when run manually; of course, this should eventually happen automatically.
#!/bin/bash
# Compresses the most recent frozen directory and encrypts the resulting
# transfer.tar.gz with gpg (testing-encryption-key).
newest=$(ls -td /opt/splunk/var/lib/splunk/test/frozendb/*/ | head -1)
echo "$newest"
tar -czvf transfer.tar.gz "$newest"
gpg -e -r testing-encryption-key -o /tmp/backup/$(date +%Y%m%d-%H%M%S).tar.gz.gpg transfer.tar.gz
rm transfer.tar.gz
sleep 2
----------------------------
The script works fine when I run it manually.
Now there are two variants for moving data from cold to frozen, both configured in indexes.conf (sketched below). Either with:
coldToFrozenDir
or
coldToFrozenScript
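For reference, here is a minimal indexes.conf sketch showing both variants. The index name [test] is only my assumption based on the path in my script, and the script path in variant two is hypothetical; as far as I know you set only one of the two options:

[test]
# 120 days * 86400 seconds
frozenTimePeriodInSecs = 10368000
# variant 1: Splunk copies each frozen bucket here; archiving it is up to you
coldToFrozenDir = /opt/splunk/var/lib/splunk/test/frozendb
# variant 2 (instead of variant 1): Splunk hands each bucket path to a script
# coldToFrozenScript = "/opt/splunk/bin/coldToFrozen.sh"
----------------------------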
When I use "coldToFrozenDir", I need a trigger around the script (compression and encryption). My idea would be a cron job every 10 minutes, but of course a directory could be missed there if runs overlap. Is there any better idea here?
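One idea I had to make the cron variant safer: instead of only grabbing the newest directory, process every bucket that has no marker file yet, and guard the whole run with flock so overlapping runs can't collide. This is only an untested sketch; the marker file name (.archived) and the lock path are my own choices:

#!/bin/bash
# Sketch: archive every frozen bucket that has no ".archived" marker yet.
set -o pipefail
exec 9>/var/lock/frozen-archive.lock
flock -n 9 || exit 0                        # another run is still active

FROZEN=/opt/splunk/var/lib/splunk/test/frozendb
OUT=/tmp/backup

for bucket in "$FROZEN"/*/; do
    marker="${bucket}.archived"
    [ -e "$marker" ] && continue            # bucket was already archived
    name=$(basename "$bucket")
    tar -czf - -C "$FROZEN" "$name" \
        | gpg -e -r testing-encryption-key \
              -o "$OUT/${name}-$(date +%Y%m%d-%H%M%S).tar.gz.gpg" \
        && touch "$marker"                  # mark only after gpg succeeded
done
----------------------------
Run via cron every 10 minutes, e.g. */10 * * * * /opt/splunk/bin/archive-frozen.sh (path again hypothetical); with the lock and the markers, an overlap should then no longer skip a directory.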
I think the better way would be to work with "coldToFrozenScript" and combine the copy step with my proof-of-concept script.
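As far as I understand the docs, Splunk calls the configured coldToFrozenScript once per bucket with the bucket path as the last argument, and deletes the bucket only after the script exits 0. A rough sketch of how I would combine that with my script (output path and key name taken from my proof of concept, the rest is an untested assumption):

#!/bin/bash
# Sketch for coldToFrozenScript: $1 is the bucket path passed in by Splunk;
# Splunk removes the bucket only after this script exits 0.
set -eo pipefail
bucket=$1
name=$(basename "$bucket")
tar -czf - -C "$(dirname "$bucket")" "$name" \
    | gpg -e -r testing-encryption-key \
          -o "/tmp/backup/${name}-$(date +%Y%m%d-%H%M%S).tar.gz.gpg"
exit 0
----------------------------
With set -e, a failing tar or gpg should make the script exit non-zero, so Splunk keeps the bucket and retries later, if I read the behavior correctly.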
Do you have a good approach or a pointer for variant one? The whole thing should of course run stably 🙂 Thank you in advance