Getting Data In

How do I configure data inputs for .csv files with dynamic field headers so that each line is a new event?

Path Finder

Running 4.2.1, we are monitoring many CSV files whose field lists differ. We have Splunk configured to dynamically read the header row for field names (props.conf: CHECK_FOR_HEADER = true), and this works brilliantly! However, the events are not splitting correctly: Splunk is indexing 256 rows into a single event. This is a .csv file with a clear newline separation between events...
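For reference, the props.conf stanza in question looks roughly like this (the sourcetype name `my_csv` is illustrative, not from our actual config):

```ini
# props.conf -- sourcetype name is illustrative
[my_csv]
# Read field names dynamically from each file's header row
CHECK_FOR_HEADER = true
```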

Has anyone else done this successfully?

Any ideas?


Splunk Employee

Most likely, Splunk is not detecting a timestamp in your rows. The default rule for Splunk is to merge lines together (SHOULD_LINEMERGE = true), but to split them whenever it detects a date (BREAK_ONLY_BEFORE_DATE = true). The easiest and best way to break on newlines is to simply set SHOULD_LINEMERGE = false, but if there are dates in your data and Splunk isn't finding them, you should also set TIME_FORMAT and TIME_PREFIX and maybe MAX_TIMESTAMP_LOOKAHEAD.
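Putting those settings together, a sketch of the stanza might look like this (the sourcetype name and the timestamp format are illustrative; set TIME_FORMAT to match the actual date column in your files):

```ini
# props.conf -- sourcetype name is illustrative
[my_csv]
CHECK_FOR_HEADER = true
# Break events on newlines instead of merging lines
SHOULD_LINEMERGE = false
# Only needed if Splunk still misreads timestamps;
# this example assumes the first column is "2011-06-01 12:34:56"
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19
```

With SHOULD_LINEMERGE = false, BREAK_ONLY_BEFORE_DATE no longer matters, since each newline already starts a new event.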



By default Splunk will merge lines in incoming logs and then break them up according to certain rules. This behavior is controlled by the SHOULD_LINEMERGE directive in props.conf (default is true). Setting SHOULD_LINEMERGE to false will tell Splunk not to combine several lines into a single event, which will give you the behavior you want.
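For completeness, here is a sketch of how the monitored files could be tied to such a sourcetype (the path and sourcetype name are illustrative):

```ini
# inputs.conf -- path and sourcetype name are illustrative
[monitor:///var/log/reports/*.csv]
sourcetype = my_csv

# props.conf
[my_csv]
CHECK_FOR_HEADER = true
SHOULD_LINEMERGE = false
```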
