
Retention Policy: Compress and Encrypt FrozenDB Data

hiph151
Explorer

Hello Community, I'm relatively new to Splunk and am now facing a challenge that is fairly big for me. The scenario is as follows:

I need to configure a retention policy of 120 days. All directories that arrive in frozen must be compressed and encrypted. For the compression and encryption I already have a small shell script that does the job manually; of course, this should happen automatically.

Proof-of-concept script:

#!/bin/bash
# Compresses the most recent frozen directory and encrypts the resulting
# transfer.tar.gz with gpg (testing-encryption-key).

newest=$(ls -td /opt/splunk/var/lib/splunk/test/frozendb/*/ | head -1)
echo "$newest"

tar -czvf transfer.tar.gz "$newest"

gpg -e -r testing-encryption-key -o "/tmp/backup/$(date +%Y%m%d-%H%M%S).tar.gz.gpg" transfer.tar.gz

rm transfer.tar.gz
sleep 2
----------------------------

The script works fine when run manually.

Now, there are two variants for moving data from cold to frozen. Either in indexes.conf with:

coldToFrozenDir
or
coldToFrozenScript
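
For reference, a minimal indexes.conf sketch of what I mean; the index name "test", the paths, and the script name freeze_encrypt.sh are just examples from my setup:

# indexes.conf -- minimal sketch; index name and paths are examples
[test]
# freeze buckets after 120 days (120 * 86400 seconds)
frozenTimePeriodInSecs = 10368000

# Variant 1: Splunk copies each frozen bucket's raw data to this directory
coldToFrozenDir = /opt/splunk/var/lib/splunk/test/frozendb

# Variant 2 (alternative; as far as I know, coldToFrozenDir takes
# precedence if both are set):
#coldToFrozenScript = /opt/splunk/bin/freeze_encrypt.sh
----------------------------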

When I use "coldToFrozenDir", I need a trigger around the script (compression and encryption). My idea would be a cron job every 10 minutes, but of course that could miss a directory if runs overlap. Is there a better idea here? A rough sketch of what I have in mind follows below.
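
To make the cron variant more robust, this sketch processes every directory that has no marker file yet and uses flock so that overlapping runs cannot collide. The .done marker files, the lock path, and the output directory are my assumptions:

#!/bin/bash
# Cron-safe sketch for variant 1 (coldToFrozenDir): archive ALL
# not-yet-processed directories instead of only the newest one.
# Marker files, lock path and output directory are assumptions.

FROZEN=/opt/splunk/var/lib/splunk/test/frozendb
OUT=/tmp/backup
mkdir -p "$OUT"

# allow only one instance at a time, even if cron fires during a long run
exec 9>/var/lock/frozen-archive.lock
flock -n 9 || exit 0

for dir in "$FROZEN"/*/; do
    name=$(basename "$dir")
    [ -e "$OUT/$name.done" ] && continue    # already archived

    # stream tar straight into gpg; no temporary file needed
    tar -czf - -C "$dir" . \
      | gpg --batch -e -r testing-encryption-key -o "$OUT/$name.tar.gz.gpg" \
      && touch "$OUT/$name.done"
done
----------------------------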

I think the better way would be to work with "coldToFrozenScript" and combine the copy step with my proof-of-concept script.
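
If I understand the docs correctly, Splunk runs the coldToFrozenScript once per frozen bucket, passes the bucket path as the first argument, and removes the bucket only after the script exits 0, so failures must not be swallowed. A minimal sketch combining that with my proof of concept (output directory and key name are again my assumptions):

#!/bin/bash
# Sketch for variant 2 (coldToFrozenScript): Splunk passes the bucket
# path as $1 and deletes the bucket only after we exit 0, so let any
# tar/gpg failure propagate. Output directory and key are assumptions.
set -euo pipefail

bucket="$1"
OUT=/tmp/backup
mkdir -p "$OUT"

# compress the bucket and encrypt the stream in one step
tar -czf - -C "$(dirname "$bucket")" "$(basename "$bucket")" \
  | gpg --batch -e -r testing-encryption-key \
        -o "$OUT/$(basename "$bucket")-$(date +%Y%m%d-%H%M%S).tar.gz.gpg"
----------------------------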

Do you have a good approach, or a path for variant one? The whole thing should of course run stably 🙂 Thank you in advance!
