Getting Data In

Tail Read error

rohanaik19
Engager

I have a server where logs are generated on a daily basis in this format:

/ABC/DEF/XYZ/xyz17012022.zip
/ABC/DEF/XYZ/xyz16012022.zip
/ABC/DEF/XYZ/xyz15012022.zip

OR 

/ABC/DEF/RST/rst17012022.gz
/ABC/DEF/RST/rst16012022.gz
/ABC/DEF/RST/rst15012022.gz

 

I am getting this error every time I index a .gz, .tar, or .zip file: "updated less than 10000ms ago, will not read it until it stops changing; has stopped changing, will read it now."

This problem was addressed earlier in this post:

https://community.splunk.com/t5/Developing-for-Splunk-Enterprise/gz-file-not-getting-indexed-in-splu...

As suggested, I have used "crcSalt = <SOURCE>", but I am still facing similar errors.

inputs.conf:

[monitor:///ABC/DEF/XYZ/xyz*.zip]
index = log_critical
disabled = false
sourcetype = Critical_XYZ
ignoreOlderThan = 2d
crcSalt = <SOURCE>

I am getting this event in the internal logs while ingesting the log file:

[screenshot attachment: rohanaik19_0-1642752455696.png]

 

 


richgalloway
SplunkTrust

Those are not errors. They're informational messages and can be safely ignored.

---
If this reply helps you, Karma would be appreciated.

rohanaik19
Engager

I am not able to pick up these compressed files and they are not getting indexed. Is there anything wrong with my configuration?


PickleRick
SplunkTrust

Handling compressed files requires special care from Splunk.

In the case of a normal text file, you can usually just read the file, remember the position within it, and later read on from that point, possibly combining the newly read data with an unprocessed chunk from the previous read. Relatively easy.
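For illustration, here is a very rough Python sketch of that offset-based approach (this is not Splunk's actual implementation; the tail_once function and STATE dictionary are just names made up for the example):

import os

# Simplified tail-style reader: remember how far we read each file
# and resume from that byte offset on the next pass.
STATE = {}  # path -> byte offset of the last fully read line

def tail_once(path):
    offset = STATE.get(path, 0)
    if os.path.getsize(path) < offset:    # file truncated or rotated: start over
        offset = 0
    lines = []
    with open(path, "rb") as f:
        f.seek(offset)
        while True:
            line = f.readline()
            if not line.endswith(b"\n"):  # EOF or a half-written line: stop
                break
            lines.append(line.rstrip(b"\n").decode("utf-8", "replace"))
            offset = f.tell()             # only advance past complete lines
    STATE[path] = offset
    return lines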

With compressed files it's not that easy, especially with zipped files. I won't go too deep into details (especially since I don't know exactly how Splunk does it ;-)), but in general, if the file is constantly changing there's no point in uncompressing it, because it is probably still in the middle of being compressed and you might get erroneous data from an unpack attempt. And (de)compression is a somewhat CPU-intensive operation. So Splunk waits until the file "seems finished".
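To see why waiting matters, here is a toy example for a .gz file (again just a sketch under my own assumptions, not how Splunk handles it internally): reading a gzip stream that is still being written typically fails outright rather than yielding usable partial data.

import gzip

def try_read_gz(path):
    # Read the whole archive in one shot. A truncated (still-growing) gzip
    # stream usually raises an error rather than returning partial content.
    try:
        with gzip.open(path, "rt", encoding="utf-8", errors="replace") as f:
            return f.read()
    except (EOFError, gzip.BadGzipFile):
        return None  # archive probably not finished yet, retry later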

But the other message suggests that Splunk has already decided that the file is OK to go and started processing it, so if you're not getting any data from it, the misconfiguration might be elsewhere.
