We are trying to monitor a logfile that behaves somewhat like a rolling logfile, except that it never creates a new file: it keeps updating the existing one. A new line is added above the "-----" separator, and the row directly below the separator is deleted. This continues until the separator reaches the end of the file, and then it starts from the top again. (The file is 1000 rows max.) Sample content:
Thu Oct 3 10:22:00 2019 Example log
Thu Oct 3 10:34:00 2019 Example log 2
-----
Wed Oct 2 06:19:00 2019 Old Example log which will be deleted upon new line
Wed Oct 2 06:22:00 2019 Old Example 2
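To make the write pattern concrete, here is a minimal sketch (Python; all names are made up) of what the logger appears to do on each update. This is a guess reconstructed from the observed behaviour, not the actual logger code:

```python
# Hypothetical sketch of the in-place "rolling" update described above:
# each update inserts the new entry just above the "-----" separator and
# deletes the row directly below it, so the file keeps a roughly fixed
# size instead of rolling over to a new file.

SEPARATOR = "-----"

def add_entry(lines, new_line):
    """Insert new_line above the separator; drop the line just below it."""
    sep = lines.index(SEPARATOR)
    lines.insert(sep, new_line)   # new entry goes above "-----"
    if sep + 2 < len(lines):      # an old entry sits below the separator
        del lines[sep + 2]        # ...and it gets deleted
    return lines

log = [
    "Thu Oct  3 10:22:00 2019 Example log",
    "-----",
    "Wed Oct  2 06:19:00 2019 Old Example log",
]
print(add_entry(log, "Thu Oct  3 10:34:00 2019 Example log 2"))
```

The net effect is that content changes in the middle of the file while the overall size stays roughly constant, which is a pattern Splunk's append-oriented file monitor does not expect.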
Our inputs.conf:

[monitor://c:\folder\log.txt]
disabled=0
index=my_index
sourcetype=my_sourcetype
Our props.conf:

[my_sourcetype]
TIME_FORMAT=%a %B %d %H:%M:%S %Y
SHOULD_LINEMERGE=false
When new lines are added halfway through the file, they do not get indexed. Other lines are indexed multiple times (maybe because lines shift row numbers and Splunk sees them as new events?). How can we monitor this file and have only the new lines added to Splunk as events?
Since we left CHECK_METHOD at its default, we assumed only new rows would be added as events, but Splunk sees it differently.
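For context: the props.conf default is CHECK_METHOD = endpoint_md5, which only checksums the beginning and end of the file, so mid-file inserts can go unnoticed while changes near the checkpointed offsets can make the whole file look new. Below is a hedged sketch of a stanza one might experiment with; entire_md5 checksums the whole file, so any in-place change re-indexes it in full, trading duplicates for no missed lines. This is an assumption to test, not a confirmed fix. (Side note: the sample timestamps, "Thu Oct 3 ...", use an abbreviated month name, which matches %b rather than the %B in the stanza above.)

```ini
# props.conf -- hypothetical experiment, not a confirmed fix
[my_sourcetype]
TIME_FORMAT=%a %b %d %H:%M:%S %Y
SHOULD_LINEMERGE=false
# entire_md5 checksums the entire file, so every in-place edit triggers a
# full re-read: no missed mid-file lines, but old rows become duplicates
CHECK_METHOD=entire_md5
```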
What's the update frequency of your file? Maybe it's too quick and Splunk can't keep up with indexing it. Your behaviour is really strange: Splunk usually indexes all new events and doesn't re-index old ones.
The update frequency varies from a few seconds to a few minutes, and sometimes a couple of days. But that doesn't correlate with the events I see in Splunk.
We have this logfile on multiple hosts, and they all behave the same way, except for new hosts where the file was newly created. Those events show up in Splunk just the way you would expect.