Getting Error from TailReader

katzr
Path Finder

Hello,

I am trying to upload a .csv file through my auto-index folder and I am getting this error in my internal logs: " -0400 ERROR TailReader - error from read call from 'G:\Data\SSRS_Subscriptions\It_SNOW_Call_Kiosk_Logs_Weekly\snow data 9-18.csv'". The file is not being indexed in Splunk.

I have a QA environment with the same inputs.conf and props.conf settings, and the file uploaded just fine there, but I am getting the same error in my internal logs. Below are my inputs.conf and props.conf stanzas. Can you please help me figure out why I am getting this error and why my file is not getting indexed?

[monitor://G:\Data\SSRS_Subscriptions\It_SNOW_Call_Kiosk_Logs_Weekly]
whitelist = .csv$
disabled = false
index = it_snow_call_kiosk_logs_weekly
sourcetype = itcc:snow
initCrcLength = 640

[itcc:snow]
INDEXED_EXTRACTIONS = csv
TRUNCATE = 50000
SHOULD_LINEMERGE = false
TIMESTAMP_FIELDS = opened_at
TIME_FORMAT = %1m/%1d/%Y %H:%M


sbbadri
Motivator

Check that:

1) The file or folder has proper read/write permission for the user Splunk runs as on that box.
2) Also try pointing the monitor stanza at the files directly: [monitor://G:\Data\SSRS_Subscriptions\It_SNOW_Call_Kiosk_Logs_Weekly\*.csv] (see the sketch below).
3) Try removing initCrcLength = 640 - if the header crosses 640 characters, the Splunk UF won't read the file.
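
A minimal sketch of suggestions 2 and 3 combined, assuming the rest of your stanza stays as in your original post (the path, index, and sourcetype are copied from there, not verified by me):

# hypothetical revised stanza - wildcard monitor, initCrcLength removed so the default (256) applies
[monitor://G:\Data\SSRS_Subscriptions\It_SNOW_Call_Kiosk_Logs_Weekly\*.csv]
disabled = false
index = it_snow_call_kiosk_logs_weekly
sourcetype = itcc:snow

With the wildcard in the monitor path, the separate whitelist line should no longer be needed.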

katzr
Path Finder

Thank you for the help! A few comments:

For 3 - the header is only ~540 characters, so I don't believe that is the problem.

For 2 - do you mean update my inputs.conf file to that line?

For 1 - I have uploaded other files to this folder before. How do I check the permissions of the file?

sbbadri
Motivator

For 2 - yes.
For 1 - Log in to the server and go to that location. Run ls -ltra; read/write permission should be enabled on the folder or file for the user Splunk runs as.
Another option is to log in to the forwarder and run these commands:

$SPLUNK_HOME/bin/splunk btool inputs list --debug - check the inputs stanza
$SPLUNK_HOME/bin/splunk list inputstatus - check your monitor stanza for the type of error and info on why it is causing the issue.
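
Since your monitored path is on a G:\ drive, the forwarder is presumably a Windows host, so ls -ltra won't be available there; a rough Windows equivalent (my assumption, not something confirmed in this thread) is to check the ACLs with icacls and run the same Splunk CLI checks from the forwarder's bin directory:

REM hypothetical Windows commands - adjust the forwarder install path to match your environment
icacls "G:\Data\SSRS_Subscriptions\It_SNOW_Call_Kiosk_Logs_Weekly"
icacls "G:\Data\SSRS_Subscriptions\It_SNOW_Call_Kiosk_Logs_Weekly\snow data 9-18.csv"
cd "C:\Program Files\SplunkUniversalForwarder\bin"
splunk btool inputs list --debug
splunk list inputstatus

The account the forwarder service runs as needs at least read access to both the folder and the file.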


katzr
Path Finder

For 2 - I changed inputs.conf, restarted Splunk, and re-dropped the file into the auto-index folder, and it was not indexed. I am not getting the error in the Splunk internal logs anymore, though.


sbbadri
Motivator

By auto-index folder, do you mean this one: G:\Data\SSRS_Subscriptions\It_SNOW_Call_Kiosk_Logs_Weekly?


katzr
Path Finder

Yes, it does. I then kept the file in the auto-index folder, edited inputs.conf again to remove the *.csv, restarted Splunk, and my file was indexed. I don't know why it is operating this way.
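
For reference, the stanza is back to roughly its original form (sketched here from the first post; I'm not showing initCrcLength since that was the other suggested change):

[monitor://G:\Data\SSRS_Subscriptions\It_SNOW_Call_Kiosk_Logs_Weekly]
whitelist = .csv$
disabled = false
index = it_snow_call_kiosk_logs_weekly
sourcetype = itcc:snow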
