Getting Data In

logrotate (sort of) file monitoring issue in Splunk LWF 6.0.2


We have multiple sets of files being monitored: file.1, file.2, file.3; Bob.1, Bob.2, Bob.3; Cat.1, Cat.2, Cat.3; etc.

For file.1, all new events are written only to file.1 until it reaches a certain quota; then writing moves to file.2 only, and then to file.3 only. Once file.3 is full, writing starts over at file.1. The same applies to the Bob and Cat logs.

In my inputs.conf:



index = ABC

recursive = false

ignoreOlderThan = 14d

disabled = false

crcSalt = <SOURCE>

In props.conf:


sourcetype = ABC_file

CHECK_METHOD = entire_md5


sourcetype = ABC_Bob

CHECK_METHOD = entire_md5


sourcetype = ABC_Cat

CHECK_METHOD = entire_md5
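For reference, a minimal sketch of how these props.conf entries are usually written, with one stanza per sourcetype (the sourcetype names are taken from the settings above; adjust to match your deployment):

```ini
# props.conf (sketch; one stanza per sourcetype)
[ABC_file]
CHECK_METHOD = entire_md5

[ABC_Bob]
CHECK_METHOD = entire_md5

[ABC_Cat]
CHECK_METHOD = entire_md5
```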

And this works great, no issues here.

The issue is that every time the active file changes (for example, file.1 fills up and events start coming into file.2), Splunk does not seem to detect it. The same goes for the Bob and Cat files. I have to restart Splunk, and it then monitors file.2 for a while until the logs move on to file.3. Am I missing something here?



I believe you should remove the ignoreOlderThan = 14d setting.

Unless the files are all the same at the beginning, you should also remove crcSalt = <SOURCE>.
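If it helps, here is a sketch of what the monitor stanza might look like with those two settings removed. The monitor path is hypothetical; substitute your actual file paths:

```ini
# inputs.conf (sketch; the monitor path below is hypothetical)
[monitor:///var/log/myapp/file.*]
index = ABC
recursive = false
disabled = false
# ignoreOlderThan and crcSalt removed as suggested above
```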


Thanks. Once the files are deemed older than 14 days, they get recorded in the fishbucket and are never monitored again, even if the files are updated later and their modtime changes. So yes, you are right: that was the problem. Thanks for the pointers!
