Is it okay to manually compress rawdata files?

Lowell
Super Champion

I noticed that not all of the files in my rawdata directory have been compressed into .gz files. While this doesn't seem to cause any problems with searching, it does take up quite a bit more disk space. The uncompressed files are up to 10 MB each, but once compressed they are often under 1 MB, which makes a big difference.
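As a rough illustration of why the savings are so large: event data is highly repetitive (timestamps, hostnames, field names), which is exactly what gzip handles well. The sketch below generates a throwaway file of repetitive log-like lines (not real Splunk rawdata) and compares sizes before and after gzip -9.

```shell
# Rough sketch: repetitive log-like data compresses very well with gzip -9.
# The data here is synthetic, not real Splunk rawdata.
tmp=$(mktemp)
for i in $(seq 1 5000); do
  echo "2010-06-01 12:00:0$((i % 10)) host=web01 action=login status=200" >> "$tmp"
done
orig=$(wc -c < "$tmp")
gzip -9 "$tmp"                      # replaces $tmp with $tmp.gz
comp=$(wc -c < "$tmp.gz")
echo "original: $orig bytes, compressed: $comp bytes"
rm -f "$tmp.gz"
```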

I asked about this a while back on the Splunk forum, and I think I discussed it with Splunk support over the phone at one point, but I was looking for confirmation that it's okay to compress these files myself, or to find out if there is a better way.

I recently noticed a file in the rawdata directory called .compressionManifest. It seems to contain a list of IDs that correspond to files in the rawdata folder. Does it get updated automatically, or if I manually run gzip, do I need to update it myself?
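One way to investigate is to cross-check the manifest against the directory contents. The sketch below assumes .compressionManifest is a plain-text list of slice IDs, one per line; that format is a guess from inspecting the file, not documented behavior, and it runs against a throwaway mock directory rather than a real index.

```shell
# Assumption: .compressionManifest is one numeric slice ID per line (a guess).
# Lists rawdata files whose ID does not appear in the manifest, using a
# throwaway mock directory instead of a live index.
rawdata=$(mktemp -d)
printf '101\n102\n' > "$rawdata/.compressionManifest"
touch "$rawdata/101" "$rawdata/102" "$rawdata/103"
missing=$(ls "$rawdata" | grep '^[0-9]' | sort | \
  while read -r id; do
    grep -qx "$id" "$rawdata/.compressionManifest" || echo "$id"
  done)
echo "not in manifest: $missing"
rm -rf "$rawdata"
```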


Just as a reference, I've been running the following command daily (as a cron job) for some time now.

find /opt/splunk/var/lib/splunk/ -type f -path '*/db/db_[0-9]*_[0-9]*_[0-9]*/rawdata/[0-9]*' \! -name '*.gz' -size +1k -print0 | xargs -0 -r gzip -9
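To show what that pipeline actually does, here is the same find/gzip command pointed at a throwaway mock of the bucket layout (db_&lt;latest&gt;_&lt;earliest&gt;_&lt;id&gt;/rawdata/&lt;sliceID&gt;) instead of a real install. The bucket and slice names below are made up for illustration.

```shell
# Same find/gzip pipeline as the cron job, run against a mock bucket tree so
# its effect is visible without touching a real Splunk install.
base=$(mktemp -d)
mkdir -p "$base/defaultdb/db/db_1275393600_1275307200_5/rawdata"
head -c 4096 /dev/zero \
  > "$base/defaultdb/db/db_1275393600_1275307200_5/rawdata/2077331"
find "$base" -type f -path '*/db/db_[0-9]*_[0-9]*_[0-9]*/rawdata/[0-9]*' \
  \! -name '*.gz' -size +1k -print0 | xargs -0 -r gzip -9
after=$(ls "$base/defaultdb/db/db_1275393600_1275307200_5/rawdata")
echo "rawdata now contains: $after"
rm -rf "$base"
```

The -size +1k test skips tiny slices where compression isn't worth the overhead, and \! -name '*.gz' makes the job safe to re-run, since already-compressed files are left alone.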

Is there any problem with doing this?

Lowell
Super Champion

Yes, this is fine, but it isn't relevant with modern versions of Splunk...
