Getting Data In

How to handle monitored files that are re-created often?

PickleRick
SplunkTrust

Ehh, I have an annoying case.

I'm monitoring a file over a Windows share (and to make things even harder to troubleshoot, I don't have direct access to the share from my administrative user; only the domain user the UF runs as has access).

The file is a CSV; it's getting properly split into fields and the date is parsed OK. I have transforms for removing the header (and a footer - this file has a footer as well). And this works mostly well.
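For context, the header/footer-stripping transforms I mean are the usual nullQueue pattern, roughly of this shape (the sourcetype, stanza names, and regexes are placeholders, not my actual config):

```
# props.conf (on the parsing tier)
[my_csv_sourcetype]
TRANSFORMS-strip = drop_csv_header, drop_csv_footer

# transforms.conf
[drop_csv_header]
# match the literal CSV header line
REGEX = ^FieldA,FieldB,FieldC$
DEST_KEY = queue
FORMAT = nullQueue

[drop_csv_footer]
# match whatever the footer line looks like
REGEX = ^Total:
DEST_KEY = queue
FORMAT = nullQueue
```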

Mostly, because every time data is added to the file, the file is apparently re-created from scratch - new data is inserted before the footer - and I'm getting entries like

02-15-2022 10:55:23.008 +0100 INFO WatchedFile - File too small to check seekcrc, probably truncated. Will re-read entire file='\\path\to\the\file'
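The input itself is a plain monitor over the UNC path, something like this (the path and sourcetype here are placeholders; I can't paste the real ones):

```
# inputs.conf on the UF
[monitor://\\server\share\export.csv]
sourcetype = my_csv_sourcetype
disabled = false
```

Since the whole file is rewritten each time, the content at the remembered seek offset no longer matches the stored CRC, so the UF treats it as a new/truncated file and re-reads it from the beginning.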

Luckily, for now the file is relatively small (some 3k lines) and doesn't eat up much license compared to this customer's other sources, but it's annoying that the same events are ingested several times a day.

The problem is that I don't see any reasonable way to avoid it. There is no deduplication functionality on input, and I don't have any "buffer" I could compare against using ingest-time eval or anything like that.

Any aces up your sleeves? 😉
