A Splunk universal forwarder inputs.conf batch stanza is attempting to read CSV files that range in size from 10MB to 2GB.
On the forwarder, splunkd.log shows "Stale file handle" and "CRC calculation" related warnings and errors on the larger files, e.g. the 800MB and 1.4GB ones.
The files are never indexed, but they are still deleted afterwards.
Are there hard or configured file size limits? And what might cause these issues other than file size?
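For reference, the input is a batch stanza along these lines (the path and sourcetype are placeholders); move_policy = sinkhole is what deletes the files once the batch input has processed them:

[batch:///path/to/csv_drop]
move_policy = sinkhole
sourcetype = my_csv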
Hi mdwecht,
File size should not really matter at all. It's all about throughput and events (IMHO).
Do your CSV files have multiple events, or are they seen as one giant event? You might have to experiment with smaller chunks of the same data in another CSV file to see how the events are appearing in Splunk.
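If they do come through as one giant event, a props.conf sketch like the one below is the usual way to force one event per line. The sourcetype name is a placeholder, and it belongs wherever parsing happens for you (the indexers, or a heavy forwarder):

[my_csv_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)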
For "Speed" - You may want to adjust this setting:
[thruput]
maxKBps =
in limits.conf. There should be a default one on the forwarder under:
$SPLUNK_HOME/etc/system/default/
But make sure to only make changes under the ../local directory; if limits.conf is not there, just create one with the stanza above.
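For example, a minimal $SPLUNK_HOME/etc/system/local/limits.conf on the forwarder could look like this (the value is illustrative; 0 removes the cap entirely, or you can pick a higher finite number such as 10240):

[thruput]
# universal forwarders default to 256 KBps, which is easy to saturate with large CSVs
maxKBps = 0

The forwarder needs a restart for the change to take effect.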
Ref: link text
For the CRC calculation errors: are you altering the files in any manner? If a file is altered before Splunk can finish reading it through, that can be an issue:
Reference: link text
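If it turns out Splunk is confusing files that begin with the same bytes (common when CSVs share an identical header row), the usual inputs.conf knobs are below. The path is a placeholder and the values are only a starting point, not a guaranteed fix for your case:

[batch:///path/to/csv_drop]
move_policy = sinkhole
# default is 256 bytes; raise it so the CRC covers more than a shared header row
initCrcLength = 1024
# mix the full source path into the CRC so identically-starting files do not collide
crcSalt = <SOURCE>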
Hope this helps,
Mike
The CSV files have one event per line; when smaller files of the same sourcetype are read and indexed, the events look fine in Splunk. Universal forwarders are throughput-limited out of the box, so we always set maxKBps = 0. The actual warnings/errors I get are "WARN FileInputTracker - Error reading CRC: Stale file handle" and/or "ERROR WatchedFile - Error reading file". The files are not in motion when Splunk is reading them: they are delivered to the folder via a NiFi safe write, i.e. COPIED into the folder under a name with a preceding "." and then MOVED (renamed) to the name the Splunk inputs.conf batch stanza is looking for, without the preceding ".".
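One thing that may be worth ruling out, assuming the batch path points at the whole drop directory, is Splunk briefly picking up the dot-prefixed temp copy before NiFi renames it. A blacklist on the stanza can exclude those in-flight files (the path and pattern here are illustrative, and this assumes blacklist is honored on your batch input as it is on monitor inputs):

[batch:///path/to/csv_drop]
move_policy = sinkhole
# ignore NiFi's in-flight copies, whose file names start with a "."
blacklist = /\.[^/]+$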