Getting Data In

How do I ingest rotated log files without the source filename changing?

marrette
Path Finder

I have several log files on several hosts which are quite high volume (nearly 2 GB/hour on a big day) and are rotated every hour. On rotation the file name changes from Device_01.log to Device_01.log.2018-10-08-12.

The inputs.conf stanza is configured like so:

[monitor:///path/to/logs/...]
disabled = false
followTail = 0
index = myIndex
whitelist = .*\.log$|.*\.log\.\d\d\d\d-\d\d-\d\d-\d\d$
ignoreOlderThan = 1d
blacklist = .*ffdc_.*log|messages_.*log|exception_.*log|trace.*log|native*.log|activity.log|systemout*[0-9].log|systemout_.*log|.*-metrics\.[0-9]{1,3}\.log|\d+.\d+.\d+.\d+.\d+.\d+.log|-\d+-\d+-\d+.log
sourcetype = myApplication

And this works - if Splunk isn't able to keep up with the data coming in before the file is rotated on the hour, it will open the renamed file and read the rest of the data from where it left off. But the source field of the remaining data will be the renamed file, not the actual log file name. So the following Splunk query will show the whole file if a wildcard is used:

host=AppServer* index=myIndex source=/path/to/logs/Device_01.log*

...but it would be nice if Splunk would keep the source field named with the original file name, not the rolled filename.

Is this possible to do?

Thanks
Eddie

1 Solution

FrankVl
Ultra Champion

You could create a props and transforms config to overwrite the value of the source field. Try something like below (regex might require some tweaking). Deploy this on your indexer(s) (or on your heavy forwarder if you use one for this data).

props.conf

[myApplication]
TRANSFORMS-setsource = myApp-setsource

transforms.conf

[myApp-setsource]
SOURCE_KEY = MetaData:Source
REGEX = (^[^\.]+\.log).*
DEST_KEY = MetaData:Source
FORMAT = source::$1

https://regex101.com/r/lcKD4Y/1
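To see what the transform does to the indexed source value, here is a quick sketch in Python (for illustration only; Splunk applies this regex itself via transforms.conf). The sample paths are assumptions based on the question:

```python
import re

# Same regex as in the transforms.conf stanza: capture everything up to
# and including ".log", dropping the rotation suffix. Note [^.]+ assumes
# no dots appear in the directory path before the filename.
pattern = re.compile(r"(^[^.]+\.log).*")

rotated = "/path/to/logs/Device_01.log.2018-10-08-12"
match = pattern.match(rotated)
print(match.group(1))  # /path/to/logs/Device_01.log
```

The captured group is what FORMAT = source::$1 writes back into MetaData:Source, so events from the rotated file keep the original source name.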


mstjohn_splunk
Splunk Employee

hi @marrette,

Did the answer below solve your problem? If so, please resolve this post by approving it! If your problem is still not solved, keep us updated so that someone else can help you.

Thanks for posting!

