I have some CSV files with 30+ columns and I cannot get Splunk to ingest them; I keep getting CRC errors. I've tried using crcSalt and initCrcLength, but I keep getting the same error message below in splunkd.log. The files can sometimes be very similar to one another. Anyone have any ideas?
splunkd.log
05-23-2017 16:15:43.313 -0400 ERROR TailReader - File will not be read, seekptr checksum did not match (file=*****\Public\Test\test.csv). Last time we saw this initcrc, filename was different. You may wish to use larger initCrcLen for this sourcetype, or a CRC salt on this source. Consult the documentation or file a support case online at http://www.splunk.com/page/submit_issue for more info.
inputs.conf stanza
[batch://\****\Public\Test\*.csv]
crcSalt = <SOURCE>
move_policy = sinkhole
sourcetype = pub_audit
index = main
DATETIME_CONFIG = CURRENT
The only way I was able to get this to work was to set CHECK_METHOD = entire_md5 in props.conf. That is OK because my files are always small.
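For reference, a minimal props.conf sketch of that workaround. This assumes you scoped it with a [source::...] stanza (CHECK_METHOD is a source-level setting) and it reuses the masked path from the batch input above; adjust the pattern to your real file path.
props.conf stanza
[source::\****\Public\Test\*.csv]
# Checksum the whole file instead of only the first bytes
CHECK_METHOD = entire_md5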
Do as @somesoni2 says and clean up the garbage, deploy this inputs.conf file to your forwarder, restart all Splunk instances on that forwarder, and make sure that the files are disappearing (if they are not, they are not being forwarded). MAKE SURE that each file has a different name that is never, ever recycled (that is what the crcSalt = <SOURCE> line is for).
Also, the quadruple asterisks in your path are quite strange; why are they there? Are you missing backslash characters between them? Are you trying to match a path that literally has a string of asterisks for a directory name? Or are you trying to match directory names that are four characters long?
Give this a try
[batch://\****\Public\Test\*.csv]
crcSalt = <SOURCE>
initCrcLength = 4999
move_policy = sinkhole
sourcetype = pub_audit
index = main
The DATETIME_CONFIG attribute is a props.conf setting, not an inputs.conf one, which is why it is not in the stanza above.
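If you still want current-time timestamping, here is a minimal props.conf sketch showing where that setting belongs, assuming the pub_audit sourcetype from your inputs.conf:
props.conf stanza
[pub_audit]
# Use the current index time instead of extracting a timestamp from the event
DATETIME_CONFIG = CURRENT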