Splunk Enterprise

Script to rotate log files?

SN1
Path Finder

I want to create a log rotation script for Splunk that makes a zip file of each of the last 3 days (individual zip files). Every new day it should make a new zip file, and if the count of zip files is greater than 3 it should delete the oldest zip file so that the count stays at 3, because we only want zip files for the last 3 days. There is a main file which stores logs every day (file name: firewall), and the script will be scheduled to run at 1 AM every day.
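The rotation described above could be sketched roughly as follows. This is a minimal sketch, not a tested solution: the log directory is passed as an argument (the `/var/log` in the cron comment is a hypothetical example), the `rotate_firewall` function name is made up for illustration, gzip is used in place of zip, and `date -d yesterday` / `xargs -r` are GNU-isms.

```shell
#!/bin/bash
# Daily rotation sketch for the main "firewall" log. Keeps the 3 newest
# daily .gz archives and deletes older ones. Scheduled from cron, e.g.:
#   0 1 * * * /path/to/rotate_firewall.sh /var/log
# (paths and function name are illustrative assumptions)

rotate_firewall() {
    local log_dir="$1"
    local log_file="$log_dir/firewall"
    local keep=3

    # Name the archive after the day whose logs it holds (the previous day,
    # since the script runs at 01:00). Fall back to today on non-GNU date.
    local stamp
    stamp=$(date -d yesterday +%Y-%m-%d 2>/dev/null || date +%Y-%m-%d)

    # Compress the current log into a dated archive, then truncate the
    # live file so the firewall keeps writing to the same path.
    gzip -c "$log_file" > "$log_dir/firewall-$stamp.gz"
    : > "$log_file"

    # Keep only the $keep newest archives: list newest first, skip the
    # first $keep lines, and remove whatever remains.
    ls -1t "$log_dir"/firewall-*.gz | tail -n +$((keep + 1)) | xargs -r rm --
}
```

For true .zip output, the `gzip -c` line could be swapped for `zip -j "$log_dir/firewall-$stamp.zip" "$log_file"` and the `firewall-*.gz` glob adjusted to match.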


thahir
Contributor

@SN1 Try the log rotation script below. Modify the log file name and path to match yours, and set up the cron schedule based on your requirements.

#!/bin/bash
# Compress log files more than 1 day old
find "<path of log file location>" -name "firewall-*.log" -type f -mtime +1 -exec gzip {} \;

# Delete .gz files more than 3 days old
find "<path of log file location>" -name "firewall-*.log.gz" -type f -mtime +3 -exec rm {} \;


Let me know if you are facing any issues.


PickleRick
SplunkTrust
SplunkTrust

Please don't share "ready to use" scripts based on serious assumptions without at least explaining what those assumptions are.

Here, your quite strong assumption is that there would be no files matching firewall-*.log.gz coming from other sources (possibly in other subdirectories).

Also - you're mixing -name with -iname.


ITWhisperer
SplunkTrust
SplunkTrust

This isn't really a Splunk question, it is a scripting question. Which scripting language do you want to use (there are many to choose from)?
