Getting Data In

Why isn't Splunk ingesting new rows from CSV file?

williamcharlton
Path Finder

I have a 4-server Splunk scenario:

  1. index server
  2. deployment server
  3. search head server
  4. deployment client server (w/ a Splunk Universal Forwarder known to be configured correctly and working, i.e., it forwards to the index server ok)

On the index server, I placed the following 200-row csv file and successfully ingested it into the index foo_index using oneshot:

"DateTime","foo"
"10/1/2019 12:03:20 AM","cat"
.. 198 more similar rows
"10/1/2019 11:55:20 PM","dog"

After verifying that the 200 events were ingested, I edited the csv file and added 200 more rows:

"DateTime","foo"
"10/1/2019 12:03:20 AM","cat"
.. 198 more similar rows
"10/1/2019 11:55:20 PM","dog"
"10/2/2019 12:01:20 AM","mouse"
.. 198 more similar rows
"10/2/2019 11:59:59 PM","mouse"

Then, on the deployment server, I created a remote folder monitor and deployed it to the deployment client server, which created \etc\apps\xxx\local\inputs.conf on that server:

[monitor://D:\foo]
disabled = false
index = foo_index
sourcetype = csv
crcSALT = <SOURCE>
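
For context, the deployment-server side of a push like this is normally driven by serverclass.conf; a minimal sketch, assuming placeholder serverclass and host names (only the app name xxx comes from the question):

[serverClass:foo_csv_inputs]
whitelist.0 = <deployment client hostname>

[serverClass:foo_csv_inputs:app:xxx]
# restart splunkd on the client so the new monitor input takes effect
restartSplunkd = true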

Then, I copied the CSV file to D:\foo on the deployment client server (D:\foo was empty before I dropped the CSV file into it).

The new 200 rows were not ingested.

Why not?

0 Karma
1 Solution

williamcharlton
Path Finder

Ignore this question. As it turns out, Splunk did ingest the data. It just took many more hours (2 days, actually) than I would have expected for a few hundred rows from a csv file.
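
For anyone curious how far behind indexing actually ran, comparing event time with index time shows the lag; a minimal SPL sketch, assuming only the index name from the question:

index=foo_index sourcetype=csv
| eval lag_hours = round((_indextime - _time) / 3600, 1)
| stats max(lag_hours) AS max_lag_hours BY source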


0 Karma


woodcock
Esteemed Legend

The problem is DEFINITELY NOT the fishbucket, because oneshot bypasses it, so any answer based on crcSalt is wrong (and could not work unless it is camel-cased correctly anyway). Are you sure that it did not get sent? Did you set restartSplunkd? If not, you need to do so and push the app out again. Also fix this:

[monitor://D:\foo\*.csv]
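
To check whether the forwarder actually picked the file up, a couple of quick checks can help; a sketch, assuming a placeholder forwarder host name:

On the deployment client (Universal Forwarder), show the tailing status of monitored files:
splunk list inputstatus

From the search head, look at the forwarder's tailing activity in its internal logs:
index=_internal host=<forwarder_host> sourcetype=splunkd (component=TailingProcessor OR component=WatchedFile)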
0 Karma

williamcharlton
Path Finder

Ignore this question. As it turns out, Splunk did ingest the data. It just took many more hours than I would have expected for a few hundred rows from a csv file.

0 Karma

woodcock
Esteemed Legend

OK, come back and either click Accept on this answer or post your own and Accept that one to close your question.

0 Karma

williamcharlton
Path Finder

Ignore this question. As it turns out, Splunk did ingest the data. It just took many more hours than I would have expected for a few hundred rows from a csv file.

0 Karma

ivanreis
Builder

When you set up an input, you have to give the full path to the file to be read.
If the csv file name is sample.csv, your input should look like this:
[monitor://D:\foo\sample.csv]
disabled = false
index = foo_index
sourcetype = csv
CRCSALT =

OR use this stanza to index all the csv files:
[monitor://D:\foo*.csv]

For further information, please check this link:
https://docs.splunk.com/Documentation/Splunk/latest/Admin/Inputsconf#MONITOR:

0 Karma

williamcharlton
Path Finder

My requirement is that I monitor a folder because many csv files can be loaded to the folder, not just one.

The stanza you specified [monitor://D:\foo*.csv] does not seem to be required:

See https://docs.splunk.com/Documentation/Splunk/7.3.1/Data/Monitorfilesanddirectories:

"Specify a path to a file or directory
and the monitor processor consumes any
new data written to that file or
directory"

0 Karma

ivanreis
Builder

OK, no problem, I just mistyped the stanza.
It should be [monitor://D:\foo\*.csv]; this will pick up all the new csv files that get added to the folder.
Thanks for your feedback.

0 Karma

williamcharlton
Path Finder

Ignore this question. As it turns out, Splunk did ingest the data. It just took many more hours than I would have expected for a few hundred rows from a csv file.

0 Karma

williamcharlton
Path Finder

My point is that Splunk docs state that you do not need to specify a file name or filename wildcards to monitor for any file in a folder. All you need is the folder name:

[monitor://D:\SplunkSensorLogs]

0 Karma