Getting Data In

Batch input not working. Please help troubleshoot it

dm1
Contributor

There is a CSV file I had added to a directory which the HF (heavy forwarder) monitors.

That input is set as Batch input.

Because there was an issue with how the data was getting formatted, I deleted the results from the search head using the | delete command.

After that, I followed the same procedure to re-ingest the CSV file.

After the file is added to the directory, it gets deleted due to the move-to-sinkhole policy, as expected for a batch input.

However, when I do a search for the same log, nothing shows up.

Can someone please explain why this is happening and how it can be fixed?

1 Solution

dm1
Contributor

Adding the below setting to the batch stanza in inputs.conf helped me re-ingest the same file:

initCrcLength = 1028

FYI, the value cannot be less than 256 or more than 1048576.
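For context, a batch stanza with this setting might look like the following sketch. The path and sourcetype here are placeholders, not from the original post; only initCrcLength and move_policy are the settings discussed above.

```ini
# inputs.conf on the heavy forwarder -- hypothetical path and sourcetype
[batch:///opt/data/csv_drop]
move_policy = sinkhole
sourcetype = my_csv
# Read more of the file's head when computing the initial CRC, so a file
# whose first 256 bytes match a previously indexed file is still treated
# as new. Valid range: 256 to 1048576 (default 256).
initCrcLength = 1028
```

A restart of the forwarder is typically needed for inputs.conf changes to take effect.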

 



gcusello
SplunkTrust
SplunkTrust

Hi @dm1,

Good for you, see you next time!

Ciao and Happy splunking.

Giuseppe

P.S.: Karma points are appreciated by all the Contributors 😉


gcusello
SplunkTrust
SplunkTrust

Hi @dm1,

by default, Splunk doesn't index the same file twice.

So if you deleted the logs from a file in Splunk, you have two options to re-index them:

  • index it manually using the guided procedure [Settings -- Add Data];
  • change the name of the file, modify your inputs.conf stanza by adding "crcSalt = <SOURCE>", and restart Splunk on the HF.
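As a sketch, the second option could look like this in inputs.conf. The monitored path is a placeholder; crcSalt = <SOURCE> is the literal setting (Splunk substitutes the source path itself).

```ini
# inputs.conf on the heavy forwarder -- hypothetical monitored path
[batch:///opt/data/csv_drop]
move_policy = sinkhole
# Mix the full source path into the CRC, so a renamed file is always
# treated as new even when its contents are identical to an old file.
crcSalt = <SOURCE>
```

Note that crcSalt affects every file in the stanza, so files that return under the same name will still be skipped unless renamed.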

Ciao.

Giuseppe


dm1
Contributor

Thanks for your reply, @gcusello. I have posted the solution that fixed my issue.
