Getting Data In

Multiline field in modular input getting newline removed while indexed

scottsavareseat
Path Finder

I am creating a modular input. My input is a CSV and I convert it to JSON to be imported as a new event in Splunk. Several of the fields have newlines in the data. However, once indexed the newlines are removed. Here is the code that does it:

    csvdata = [row for row in csv.reader(data.splitlines())]
    header = csvdata.pop(0)
    for row in csvdata:
        e = {}
        for col, val in zip(header, row):
            col = col.replace(" ", "_")
            e[col] = val
        event_time = calendar.timegm(time.strptime(e["timefield"], time_pattern))
        event = helper.new_event(data=json.dumps(e), time=event_time, index=index, unbroken=True)
        ew.write_event(event)

One thing I've tried is adding SHOULD_LINEMERGE=0 to props.conf, which didn't work. Is there a way to tell Splunk not to remove the newlines from fields?

Thanks!


scottsavareseat
Path Finder

I'm going to mark this as resolved.

The problem wasn't during indexing. It was actually here:

    csvdata = [row for row in csv.reader(data.splitlines())]

It mishandled the newlines. Getting rid of that and splitting on "\r\n" instead solved the problem.
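For anyone hitting the same symptom, here's a minimal sketch of what goes wrong (the sample data and field names are made up): str.splitlines() breaks on the embedded "\n" inside a quoted field and discards the newline characters, so when csv.reader stitches the quoted field back together the newline is already gone. Splitting only on the "\r\n" record terminator (or handing the raw string to csv.reader via io.StringIO) keeps the "\n" inside the field.

```python
import csv
import io

# Made-up CSV payload: records end with "\r\n", and one quoted field
# contains an embedded "\n".
data = 'name,notes\r\nalice,"line one\nline two"\r\n'

# Bug: splitlines() also splits on the "\n" inside the quoted field and
# strips the newline characters, so csv.reader rejoins the field without them.
broken = [row for row in csv.reader(data.splitlines())]

# Fix 1 (what worked here): split only on the "\r\n" record terminator,
# leaving the embedded "\n" inside the quoted field.
fixed_split = [row for row in csv.reader(data.split("\r\n"))]

# Fix 2: skip manual splitting and let csv.reader consume the raw
# string through a file-like object.
fixed_io = [row for row in csv.reader(io.StringIO(data))]

print(broken[1])       # embedded newline lost
print(fixed_split[1])  # ['alice', 'line one\nline two']
print(fixed_io[1])     # ['alice', 'line one\nline two']
```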
