Getting Data In

Tail Read error

rohanaik19
Engager

I have a server where logs are generated on a daily basis in this format:

/ABC/DEF/XYZ/xyz17012022.zip
/ABC/DEF/XYZ/xyz16012022.zip
/ABC/DEF/XYZ/xyz15012022.zip

OR 

/ABC/DEF/RST/rst17012022.gz
/ABC/DEF/RST/rst16012022.gz
/ABC/DEF/RST/rst15012022.gz

 

I am getting this error every time I index a .gz, .tar, or .zip file: "updated less than 10000ms ago, will not read it until it stops changing; has stopped changing, will read it now."

This problem was addressed earlier in this post:

https://community.splunk.com/t5/Developing-for-Splunk-Enterprise/gz-file-not-getting-indexed-in-splu...

As suggested there, I have used crcSalt = <SOURCE>, but I am still facing similar errors.

inputs.conf:

[monitor:///ABC/DEF/XYZ/xyz*.zip]
index = log_critical
disabled = false
sourcetype = Critical_XYZ
ignoreOlderThan = 2d
crcSalt = <SOURCE>
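
(For context, ignoreOlderThan = 2d tells the monitor to skip any file whose modification time is more than two days old. A rough Python sketch of that check, purely for illustration; the paths and window below mirror the stanza above, not Splunk's internal code:)

import glob
import os
import time

WINDOW_SECONDS = 2 * 24 * 3600  # the "2d" from ignoreOlderThan above

for path in glob.glob("/ABC/DEF/XYZ/xyz*.zip"):
    age_seconds = time.time() - os.path.getmtime(path)
    if age_seconds > WINDOW_SECONDS:
        # a monitor with ignoreOlderThan = 2d would skip this file entirely
        print(path, "older than 2d, would be ignored")
    else:
        print(path, "within the 2d window, eligible for monitoring")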

I am getting this event in the internal logs while ingesting the log file:

[screenshot of the internal log event]

 

 


richgalloway
SplunkTrust

Those are not errors.  They're informational messages and can be ignored safely.

---
If this reply helps you, Karma would be appreciated.

rohanaik19
Engager

I am not able to pick up these compressed files and they are not getting indexed. Is there anything wrong with my configuration?


PickleRick
SplunkTrust

Handling compressed files requires special care from Splunk.

In the case of a normal text file, you can usually just read the file, remember the position within it, and later read on from that point, possibly combining the newly read data with an unprocessed chunk from the previous read. Relatively easy.
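
(Roughly, in Python, just to make that idea concrete; this is only a sketch of the general approach, not how Splunk itself implements it, and the file paths are made up:)

import os

LOG_FILE = "/ABC/DEF/XYZ/example.log"   # hypothetical plain-text log
OFFSET_FILE = "/tmp/example.offset"     # hypothetical place to persist the read position

def read_new_data():
    # Load the last known position, defaulting to the start of the file.
    try:
        with open(OFFSET_FILE) as f:
            offset = int(f.read().strip() or 0)
    except FileNotFoundError:
        offset = 0

    with open(LOG_FILE, "r") as f:
        f.seek(offset)        # resume where the previous read stopped
        new_data = f.read()   # may end mid-line; a real reader would keep that chunk for the next pass
        offset = f.tell()

    # Remember the new position for the next read.
    with open(OFFSET_FILE, "w") as f:
        f.write(str(offset))

    return new_data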

With compressed files it's not that easy, especially with zipped files. I won't go too deep into the details (especially since I don't know exactly how Splunk does it ;-)), but in general, if the file is constantly changing there's no point in uncompressing it, because it is probably still in the middle of the compression process and you might get erroneous data from an unpack attempt. And (de)compression is a somewhat CPU-intensive operation. So Splunk waits until the file "seems finished".
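
(Again a rough Python sketch of the "wait until it seems finished" idea; the 10-second quiet period just mirrors the 10000ms in the message above, it is not Splunk's actual logic:)

import gzip
import os
import time

QUIET_PERIOD = 10  # seconds without size/mtime changes before trusting the file

def read_when_stable(path):
    # Poll until size and modification time stop changing for a full quiet period.
    while True:
        before = os.stat(path)
        time.sleep(QUIET_PERIOD)
        after = os.stat(path)
        if (before.st_size, before.st_mtime) == (after.st_size, after.st_mtime):
            break

    # Only now pay the CPU cost of decompressing the (hopefully complete) file.
    with gzip.open(path, "rt") as f:
        return f.read()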

But the other message suggests that Splunk had already decided that the file is OK to go and started processing it, so if you're not getting any data from it, the misconfiguration might be elsewhere.
