Getting Data In

Multiline field in modular input getting newline removed while indexed

scottsavareseat
Path Finder

I am creating a modular input. My input is a CSV, which I convert to JSON and import as new events into Splunk. Several of the fields contain newlines in the data. However, once indexed, the newlines are removed. Here is the code that does it:

    import calendar
    import csv
    import json
    import time

    csvdata = [row for row in csv.reader(data.splitlines())]
    header = csvdata.pop(0)
    for row in csvdata:
        e = {}
        for col, val in zip( header, row ):
            col = col.replace( " ", "_" )
            e[col] = val
        event_time = calendar.timegm(time.strptime(e["timefield"], time_pattern))
        event = helper.new_event(data=json.dumps(e), time=event_time, index=index, unbroken=True)
        ew.write_event(event)

One thing I've tried is adding SHOULD_LINEMERGE = 0 to props.conf, which didn't work. Is there a way to tell Splunk not to remove the newlines from fields?

Thanks!


scottsavareseat
Path Finder

I'm going to mark this as resolved.

The problem wasn't during indexing. It was actually here:

    csvdata = [row for row in csv.reader(data.splitlines())]

It mishandled the newlines: splitlines() strips the line breaks, so csv.reader rejoins a quoted multiline field without them. Getting rid of that and splitting on "\r\n" instead solved the problem.
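For future readers, here is a minimal sketch of the difference, using a made-up two-column sample (not the original data). It assumes rows are terminated with "\r\n" while the newlines embedded inside quoted fields are bare "\n":

    import csv
    import io

    # Made-up sample: rows end with \r\n, and a quoted field contains a bare \n.
    data = 'name,notes\r\nalpha,"line one\nline two"\r\n'

    # splitlines() also splits on the \n inside the quoted field and strips it,
    # so csv.reader stitches the field back together without the newline.
    broken = list(csv.reader(data.splitlines()))

    # Splitting only on the \r\n row terminator leaves the bare \n in place,
    # and csv.reader keeps it inside the quoted field.
    fixed = list(csv.reader(data.split("\r\n")))

    # Handing csv.reader a file-like object over the raw text also preserves
    # the embedded newline, without assuming a particular row terminator.
    general = list(csv.reader(io.StringIO(data)))

The file-like-object form is the most general, since csv.reader sees the original line breaks instead of pre-split lines.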
