Getting Data In

Indexed CSV won't line break?

jdunlea_splunk
Splunk Employee

I'm indexing a CSV file and I have SHOULD_LINEMERGE set to "false" so it will break at each new line.

However, per 24-hour period (roughly 600,000 events), I get ~50 events that are not line-broken correctly and have half of the event indexed as a new event. How is this even happening if I have SHOULD_LINEMERGE=false? Isn't the default to break at a new line?

The only thing I can think of is that a small subset of the events in the CSV are broken over two lines (if that's even possible). Or is there a limit to the number of characters Splunk will check for a line break before it just breaks the event at the limit? That would mean a few very long entries in the CSV file were not checked all the way to the end because of some limit.
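
For reference, the relevant props.conf stanza would look roughly like this (the sourcetype name is a placeholder, and LINE_BREAKER / TRUNCATE are shown at what I believe are their defaults):

# props.conf -- sourcetype name is a placeholder
[my_csv_sourcetype]
SHOULD_LINEMERGE = false
# Default line breaker: start a new event at each run of newlines/carriage returns
LINE_BREAKER = ([\r\n]+)
# TRUNCATE caps the maximum line length in bytes; longer lines are
# truncated rather than split into a new event (default is 10000)
TRUNCATE = 10000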


jt_splunk
Explorer

The only time I've run into this is when the application that generated the CSV file had corrupt data coming in. Can you post a sample of your data?

If you're sure your dataset is clean, you may want to look at enabling SHOULD_LINEMERGE and then tweaking MUST_NOT_BREAK_BEFORE, as discussed here: http://docs.splunk.com/Documentation/Splunk/latest/Data/Indexmulti-lineevents.
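
For illustration only, that approach might look something like this, assuming each genuine event begins with an ISO-style timestamp (the sourcetype name and regex are placeholders you'd adapt to your data):

# props.conf -- illustrative sketch only; adapt the stanza name and regex
[my_csv_sourcetype]
SHOULD_LINEMERGE = true
# Treat any line that does NOT begin with an ISO-style date as a
# continuation of the previous event, so it gets merged instead of
# becoming a new event
MUST_NOT_BREAK_BEFORE = ^(?!\d{4}-\d{2}-\d{2})

BREAK_ONLY_BEFORE with the same date pattern is another common way to get a similar effect, if that fits your data better.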
