Getting Data In

Indexed CSV won't line break?

jdunlea_splunk
Splunk Employee

I'm indexing a CSV file and I have SHOULD_LINEMERGE set to "false" so it will break at each new line.

However, per 24-hour period (and about 600,000 events), I get ~50 events that are not line-broken correctly and have half of the event appear as a new event. How is this even happening if I have SHOULD_LINEMERGE=false? Isn't the default to break at a new line?

The only thing I can think of is that a small subset of the events in the CSV are broken over two lines (if that's even possible). Or is there a limit to the number of characters that Splunk will check for a line break before it just breaks the event at the limit? That would mean we had a few very long entries in the CSV file that Splunk didn't check all the way to the end because of a limit of some sort.
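
For reference, the relevant props.conf settings look something like this (the sourcetype name is a placeholder, and the values shown are the defaults rather than necessarily what I have set):

[my_csv_sourcetype]
SHOULD_LINEMERGE = false
# With line merging off, events are split wherever the LINE_BREAKER regex
# matches; the default is a run of newlines.
LINE_BREAKER = ([\r\n]+)
# TRUNCATE caps how many bytes Splunk reads per line (default 10000),
# so an unusually long CSV row could be cut at that limit.
TRUNCATE = 10000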


jt_splunk
Explorer

The only time I've run into this is when the application that generated the CSV file had corrupt data coming in. Can you post a sample of your data?

If you're sure your dataset is clean, you may want to look at enabling SHOULD_LINEMERGE and then tweaking MUST_NOT_BREAK_BEFORE discussed here: http://docs.splunk.com/Documentation/Splunk/latest/Data/Indexmulti-lineevents.
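
If some records really are wrapped over two lines, a rough props.conf sketch along these lines might help (the sourcetype name and the date pattern are assumptions; adjust the regex to whatever marks the start of a genuine record in your CSV):

[my_csv_sourcetype]
SHOULD_LINEMERGE = true
# Assumes every genuine record begins with an ISO date (YYYY-MM-DD).
# A line that does not match is treated as a continuation, so Splunk
# will not break a new event before it.
MUST_NOT_BREAK_BEFORE = ^(?!\d{4}-\d{2}-\d{2})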
