Getting Data In

Monitoring a directory of csv files: TailReader problem

marcos_eng1
Explorer

I am monitoring a directory with 101 csv files in the same format, but only 49 of them are being indexed. When I start up Splunk I get a warning message from TailReader: Could not send data to output queue (parsingqueue), retrying...

Sample of csv files:

Timestamp,Value (%)
21-Sep-20 6:38:00 AM BRT,0.0
21-Sep-20 6:39:00 AM BRT,0.0
21-Sep-20 6:40:00 AM BRT,0.0
21-Sep-20 6:41:00 AM BRT,0.0
21-Sep-20 6:42:00 AM BRT,0.0
21-Sep-20 6:43:00 AM BRT,0.0
21-Sep-20 6:44:00 AM BRT,0.0
21-Sep-20 6:45:00 AM BRT,0.0
21-Sep-20 6:46:00 AM BRT,0.0
21-Sep-20 6:47:01 AM BRT,0.0

Timestamp,Value (%)
21-Sep-20 6:38:00 AM BRT,0.0
21-Sep-20 6:39:00 AM BRT,0.0
21-Sep-20 6:40:00 AM BRT,0.0
21-Sep-20 6:41:00 AM BRT,0.0
21-Sep-20 6:42:00 AM BRT,0.0
21-Sep-20 6:43:00 AM BRT,0.0
21-Sep-20 6:44:00 AM BRT,0.0
21-Sep-20 6:45:00 AM BRT,0.0
21-Sep-20 6:46:00 AM BRT,0.0
21-Sep-20 6:47:01 AM BRT,0.0

[porto_file_csv]
BREAK_ONLY_BEFORE_DATE =
DATETIME_CONFIG =
HEADER_FIELD_LINE_NUMBER = 13
INDEXED_EXTRACTIONS = csv
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Structured
description = Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled = false
pulldown_type = true
EXTRACT-Chiller,Variavel = /opt/POC_Chiller/POC_(?P<Chiller>CH\d)_(?P<Variavel>\w+) in source
REPORT-poc_porto = REPORT-poc_porto

[monitor:///opt/POC_Chiller]
disabled = false
host = test4
index = test_porto
sourcetype = porto_file_csv

Note: I have also tried monitoring the files with the default csv sourcetype, and again it didn't work.

Any help would be very appreciated!

Marcos Pereira

1 Solution

alonsocaio
Contributor

Are you generating new files with the same name, or just updating their contents?

Looking at the error in the internal log you provided, I would try testing the crcSalt option on your monitoring input stanza (if file names keep changing and each new file is created with a different name):

crcSalt = <SOURCE>

If you create files using the same file name (replacing them instead of updating them), I would instead try increasing the initCrcLength option. The default value is 256:

initCrcLength = <INTEGER>

Both options go in inputs.conf. If you need more information about the two options, refer to this doc: https://docs.splunk.com/Documentation/Splunk/8.1.0/Admin/Inputsconf
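Applied to the monitor stanza from the question, the fix might look like the sketch below. The stanza values are copied from the original post; the initCrcLength value of 1024 is only an illustrative assumption, and you would normally pick just one of the two options depending on how the files are produced:

```
[monitor:///opt/POC_Chiller]
disabled = false
host = test4
index = test_porto
sourcetype = porto_file_csv
# Mix the full source path into the fingerprint so files whose
# leading bytes are identical are still tracked separately:
crcSalt = <SOURCE>
# Alternative: fingerprint more than the default 256 bytes
# (hypothetical value; tune to exceed the shared header/rows):
# initCrcLength = 1024
```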


alonsocaio
Contributor

Hi @marcos_eng1 

Are you using a Universal Forwarder or a Heavy Forwarder instance to monitor these csv files? Or is the input stanza on another instance?

Also, what is the size of these files?


marcos_eng1
Explorer

Hello @alonsocaio

I am using the inputs.conf on a standalone server.

Please see my internal logs related to the TailReader failure:

10-27-2020 16:02:40.320 -0300 ERROR TailReader - File will not be read, seekptr checksum did not match (file=/opt/POC_Chiller/POC_CH1_CAP_TOTAL_B.csv). Last time we saw this initcrc, filename was different. You may wish to use larger initCrcLen for this sourcetype, or a CRC salt on this source. Consult the documentation or file a support case online at http://www.splunk.com/page/submit_issue for more info.


10-27-2020 15:03:14.788 -0300 INFO WatchedFile - File too small to check seekcrc, probably truncated. Will re-read entire file='/opt/POC_Chiller/POC_CH1_CAP_TOTAL_B.csv'.
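The first ERROR above points at Splunk's file fingerprinting: the tail reader identifies a file by a CRC of its first initCrcLength bytes (256 by default). Because every CSV here starts with the identical "Timestamp,Value (%)" header followed by near-identical timestamp rows, two different files can share the same fingerprint and be skipped as already-seen. A minimal sketch of that idea (illustrative only, with hypothetical data; not Splunk's actual implementation):

```python
import zlib

# Two CSV files from different chillers: identical header and identical
# early rows, diverging only after the first 256 bytes (hypothetical data).
header = "Timestamp,Value (%)\n"
rows = "".join(f"21-Sep-20 6:{38 + i}:00 AM BRT,0.0\n" for i in range(10))
file_a = header + rows + "21-Sep-20 7:00:00 AM BRT,42.5\n"
file_b = header + rows + "21-Sep-20 7:00:00 AM BRT,17.3\n"

# Fingerprint only the first 256 bytes, as a default-length CRC would.
crc_a = zlib.crc32(file_a.encode()[:256])
crc_b = zlib.crc32(file_b.encode()[:256])

print(crc_a == crc_b)  # True: both files look "the same" to the tailer
```

Salting the CRC with the source path, or lengthening the fingerprinted prefix past the shared header and rows, breaks the collision.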



marcos_eng1
Explorer

It worked! Thanks @alonsocaio


marcos_eng1
Explorer

@alonsocaio

Also, the csv files are no bigger than 17 KB.
