Getting Data In

How to avoid re-ingesting monitored files that are frequently re-created?


Ehh, I have an annoying case.

I'm monitoring a file over a Windows share (and to make troubleshooting even worse, I don't have direct access to the share from my administrative user; only the domain user the UF runs as has access).

The file is a CSV; it's getting properly split into fields and the date is parsed OK. I have transforms for removing the header (and the footer; this file has a footer as well). And this works mostly well.
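For reference, header/footer stripping like this is typically done by routing the matching lines to nullQueue. A minimal sketch, assuming a made-up sourcetype name and regexes (the actual patterns from the original setup are not shown in the post):

```
# props.conf -- hypothetical sourcetype name
[vendor_csv]
TRANSFORMS-drop = drop_header, drop_footer

# transforms.conf
[drop_header]
# assumed header pattern; adjust to the real CSV header line
REGEX = ^Field1,Field2,Field3
DEST_KEY = queue
FORMAT = nullQueue

[drop_footer]
# assumed footer pattern, e.g. a trailing summary line
REGEX = ^Total,
DEST_KEY = queue
FORMAT = nullQueue
```

Note that with a universal forwarder these index-time TRANSFORMS are applied where parsing happens (typically the indexer or a heavy forwarder), not on the UF itself.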

Mostly, because every time data is added to the file, the file is apparently recreated from scratch (new data is inserted before the footer), and I'm getting entries like:

02-15-2022 10:55:23.008 +0100 INFO WatchedFile - File too small to check seekcrc, probably truncated. Will re-read entire file='\\path\to\the\file'

Luckily, for now the file is relatively small (some 3k lines) and doesn't eat up much license compared to this customer's other sources, but it's annoying that the same events get ingested several times during the day.
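For context, the monitor input's file-tracking knobs live in inputs.conf. A hedged sketch (the path and sourcetype are placeholders, not the real ones from the post); note that `initCrcLength` only changes how the file is fingerprinted, it does not stop re-reads of a file that has shrunk or been rewritten in place:

```
# inputs.conf -- placeholder path and sourcetype
[monitor://\\server\share\export.csv]
sourcetype = vendor_csv
# The forwarder fingerprints the first initCrcLength bytes
# (default 256). Raising it helps distinguish files that share
# an identical header, but it will not prevent the "probably
# truncated, will re-read entire file" behavior seen above.
initCrcLength = 1024
```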

The problem is that I don't see any reasonable way to avoid it. There is no deduplication functionality on input, and I don't have any "buffer" I could compare the data against using ingest-time eval or anything similar.
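Lacking ingest-time dedup, one workaround is to deduplicate at search time instead. A minimal SPL sketch (index and sourcetype are placeholders):

```
index=main sourcetype=vendor_csv
| dedup _raw
```

This hides the duplicates in search results, but it does not reduce license usage, since the repeated events are still indexed.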

Any aces up your sleeves? 😉
