Getting Data In

Multiline field in modular input getting newlines removed when indexed

scottsavareseat
Path Finder

I am creating a modular input. My input is a CSV file, and I convert each row to JSON so it can be indexed as a new event in Splunk. Several of the fields contain newlines. However, once the data is indexed, the newlines are removed. Here is the code that does the conversion:

    import calendar
    import csv
    import json
    import time

    # Parse the raw CSV text, treating the first row as the header.
    csvdata = [row for row in csv.reader(data.splitlines())]
    header = csvdata.pop(0)
    for row in csvdata:
        # Build a dict for the row, replacing spaces in column names with underscores.
        e = {}
        for col, val in zip(header, row):
            col = col.replace(" ", "_")
            e[col] = val
        # Convert the row's timestamp to epoch time and write the row as a JSON event.
        event_time = calendar.timegm(time.strptime(e["timefield"], time_pattern))
        event = helper.new_event(data=json.dumps(e), time=event_time, index=index, unbroken=True)
        ew.write_event(event)

One thing I've tried is adding SHOULD_LINEMERGE=0 to props.conf, which didn't work. Is there a way to tell Splunk not to remove the newlines from fields?

Thanks!


scottsavareseat
Path Finder

I'm going to mark this as resolved.

The problem wasn't during indexing. It was actually here:

     csvdata = [row for row in csv.reader(data.splitlines())]

That call mishandled the newlines: splitlines() also split on the newlines embedded inside the quoted fields. Getting rid of it and splitting the data on "\r\n" instead solved the problem.
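
For reference, here is a minimal sketch of how that fix could look, still running csv.reader over records split on "\r\n"; the sample data string and variable names are illustrative assumptions, not the original input:

    import csv
    import json

    # Illustrative sample only: records end with "\r\n", while newlines inside
    # quoted fields are bare "\n".
    data = 'Time Field,Notes\r\n"2019-01-01 00:00:00","line one\nline two"\r\n'

    rows = csv.reader(data.split("\r\n"))
    header = next(rows)
    for row in rows:
        if not row:  # skip the empty trailing record left by the final "\r\n"
            continue
        e = {col.replace(" ", "_"): val for col, val in zip(header, row)}
        print(json.dumps(e))  # the embedded newline survives as \n in the JSON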
