Getting Data In

Splunk suddenly stops ingesting data.

soumdey0192
Explorer

Hi All,

I have a scripted output file that Splunk is ingesting via a heavy forwarder.

For the last few weeks I have been facing an issue where Splunk suddenly stops ingesting the data, even though the script is still writing data to the output file.

The script is configured to run every 2 minutes; each run removes the previous data and writes the new data to the file.
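Roughly, each run does something like this (sketched in Python purely for illustration; the real script's language and exact file handling may differ):

# Rough sketch of one 2-minute run: drop the previous output, write the new one.
# Whether the file is deleted and recreated (as below) or truncated in place
# is an assumption -- the real script may do either.
import os

OUTPUT_PATH = "/tmp/splunkDataFiles/labSanityCheck.txt"

def write_output(lines):
    # With delete-and-recreate, the monitored path briefly does not exist
    # between these two steps.
    if os.path.exists(OUTPUT_PATH):
        os.remove(OUTPUT_PATH)
    with open(OUTPUT_PATH, "w") as out:
        out.write("\n".join(lines) + "\n")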

When I check the internal logs, I get the below errors:

03-18-2020 12:50:01.521 +0000 ERROR TailReader - Ignoring path="/tmp/splunkDataFiles/labSanityCheck.txt" due to: Bug: tried to check/configure STData processing but have no pending metadata.

03-18-2020 12:50:01.517 +0000 ERROR TailReader - failed to compute crc for /tmp/splunkDataFiles/labSanityCheck.txt (method: 0, hint: No such file or directory).

Based on previous answers to similar problems, I set CHARSET=AUTO, but that did not help.

Can somebody suggest anything regarding this issue?

nakiamatthews
Explorer

I am also having the same issue. In my case, the file being read contains data coming in via syslog and written to disk. In the past I have deleted the files and restarted my universal forwarder, which worked for a while, but eventually the logs would stop and I would see the CRC errors in my Splunk logs.

woodcock
Esteemed Legend

We see this problem all the time, and it is usually because there are far too many files co-resident with the files that you are monitoring. This typically happens when there is no housekeeping, or only a very lax policy, for deleting the files as they rotate. Yes, even if you are not monitoring the rotated files because they do not match the pattern in your [monitor...] stanza, as they pile up they will eventually slow the forwarder down to a crawl. It usually starts when you have hundreds of files, and you are crippled by the time you get to thousands. If you cannot delete the files that are old and done, you can instead create soft links to the fresh files in another directory and point your monitor at that; a rough sketch is below.
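Something along these lines, run every few minutes from cron, keeps a small, clean directory for the [monitor...] stanza to scan. The paths, filename pattern, and age threshold are placeholders you would adjust for your environment:

# Sketch: maintain a directory of symlinks to only the "fresh" files, so the
# forwarder monitors a small directory instead of one cluttered with
# thousands of rotated files.
import os
import glob
import time

SOURCE_DIR = "/var/log/myapp"        # directory the app writes/rotates into
LINK_DIR = "/var/log/myapp_fresh"    # directory the forwarder actually monitors
MAX_AGE_SECONDS = 24 * 3600          # treat files older than this as "done"

os.makedirs(LINK_DIR, exist_ok=True)
now = time.time()

# Drop links whose targets are gone or have gone stale.
for link in glob.glob(os.path.join(LINK_DIR, "*")):
    target = os.path.realpath(link)
    if not os.path.exists(target) or now - os.path.getmtime(target) > MAX_AGE_SECONDS:
        os.remove(link)

# Link any fresh files that are not linked yet.
for path in glob.glob(os.path.join(SOURCE_DIR, "*.log")):
    if now - os.path.getmtime(path) <= MAX_AGE_SECONDS:
        link = os.path.join(LINK_DIR, os.path.basename(path))
        if not os.path.lexists(link):
            os.symlink(path, link)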

soumdey0192
Explorer

Apologies for the delayed response...

The script I am talking about here deletes the existing data in the output file and overwrites it with new data.

PavelP
Motivator

The script output can be buffered and not flushed to the file system immediately. This is especially the case if your script produces only a small amount of output. Try searching for "flush stdout your_script_language".
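For example, if your script happens to be Python, an explicit flush (plus an fsync if you want to be thorough) after the write looks like this:

import os

# Make sure the output actually reaches the file on disk instead of sitting
# in a user-space or OS buffer until the process exits.
with open("/tmp/splunkDataFiles/labSanityCheck.txt", "w") as f:
    f.write("sanity check output\n")  # placeholder payload
    f.flush()                         # push Python's internal buffer to the OS
    os.fsync(f.fileno())              # ask the OS to commit it to disk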
