
Is it okay to manually compress rawdata files?

Lowell
Super Champion

I noticed that not all of the files in my rawdata directory have been compressed into .gz files. While this doesn't seem to cause any problems with searching, it does take up quite a bit more disk space. The uncompressed files can be up to 10 MB each, but once compressed they are often under 1 MB, which makes a big difference.
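
For context, here is roughly how I compare the two. This assumes GNU find (for -printf), the default /opt/splunk/var/lib/splunk data path, and the numeric slice-file naming used above:

# Total size of uncompressed rawdata slices (candidates for gzip)
find /opt/splunk/var/lib/splunk/ -type f -path '*/rawdata/[0-9]*' \! -name '*.gz' -printf '%s\n' | awk '{n+=$1} END {printf "uncompressed: %.1f MB\n", n/1048576}'

# Total size of slices that have already been compressed
find /opt/splunk/var/lib/splunk/ -type f -path '*/rawdata/[0-9]*.gz' -printf '%s\n' | awk '{n+=$1} END {printf "compressed: %.1f MB\n", n/1048576}'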

I asked about this a while back on the Splunk forum, and I think I discussed it with Splunk support over the phone at one point, but I'm looking for confirmation that it's okay to compress these files myself, or to find out whether there is a better way.

I recently noticed a file in the rawdata directory called .compressionManifest. It seems to contain a list of IDs corresponding to files in the rawdata folder. Does this get updated automatically, or do I need to update it myself if I run gzip manually?


Just as a reference, I've been running the following command daily (as a cron job) for some time now.

find /opt/splunk/var/lib/splunk/ -type f -path '*/db/db_[0-9]*_[0-9]*_[0-9]*/rawdata/[0-9]*' \! -name '*.gz' -size +1k -print0 | xargs -0 -r gzip -9
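
For completeness, the crontab entry driving it looks roughly like this (the 3 a.m. schedule here is just an example, not necessarily the time you would pick):

# Compress old rawdata slice files once a day
0 3 * * * find /opt/splunk/var/lib/splunk/ -type f -path '*/db/db_[0-9]*_[0-9]*_[0-9]*/rawdata/[0-9]*' \! -name '*.gz' -size +1k -print0 | xargs -0 -r gzip -9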

Is there any problem with doing this?

Lowell
Super Champion

Yes, this is fine, but it isn't relevant with modern versions of Splunk...
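
For anyone finding this later: as far as I know, recent Splunk versions store each bucket's rawdata as a single compressed journal file (rawdata/journal.gz) rather than the individual slice files described above, so there is nothing left to gzip by hand. A quick way to check which layout a given indexer is using, assuming the default data path:

# Buckets using the newer compressed-journal layout
find /opt/splunk/var/lib/splunk/ -type f -path '*/rawdata/journal.gz' | head

# Buckets still holding old-style uncompressed slice files
find /opt/splunk/var/lib/splunk/ -type f -path '*/rawdata/[0-9]*' \! -name '*.gz' | head

If only journal.gz files show up, the manual gzip step above is unnecessary.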
