Getting Data In

How to avoid monitored files being re-created often?

PickleRick
SplunkTrust

Ehh, I have an annoying case.

I'm monitoring a file over a Windows share (and, to make troubleshooting even harder, I don't have direct access to the share from my administrative user; only the domain user the UF runs as has access).

The file is a CSV; it's getting properly split into fields and the date is parsed OK. I have transforms for removing the header (and the footer - this file has a footer as well). And this works mostly well.

Mostly, because every time data is added to the file, the file is apparently recreated from scratch - new data is inserted before the footer - and I'm getting entries like

02-15-2022 10:55:23.008 +0100 INFO WatchedFile - File too small to check seekcrc, probably truncated. Will re-read entire file='\\path\to\the\file'

Luckily, for now the file is relatively small (some 3k lines) and doesn't eat up much license compared to this customer's other sources, but it's annoying that the same events are getting ingested several times a day.

The problem is that I don't see any reasonable way to avoid it. There is no deduplication functionality on input, and I don't have any "buffer" I could compare against using an ingest-time eval or anything like that.
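For context, the header/footer removal mentioned above is the usual nullQueue routing. A sketch of what it looks like (the sourcetype name, column names, and footer pattern here are made up for illustration):

```ini
# props.conf
[my_csv_sourcetype]
TRANSFORMS-drop_hf = drop_csv_header, drop_csv_footer

# transforms.conf
[drop_csv_header]
# Match the literal header row and route it to the null queue
REGEX = ^ColumnA,ColumnB,ColumnC$
DEST_KEY = queue
FORMAT = nullQueue

[drop_csv_footer]
# Match the footer line(s) and discard them the same way
REGEX = ^Total\s+rows:
DEST_KEY = queue
FORMAT = nullQueue
```

Note that these run at parse time on the indexer (or a heavy forwarder), not on the UF itself - so they can drop the header and footer, but they can't help with the re-read of already-indexed lines.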

Any aces up your sleeves? 😉
