Getting Data In

Importing a CSV and line breaking within a field

diesel6e
New Member

Hi Ninjas-
I am trying to import a CSV that is in the following format, with a header that defines the fields:

field a | field b | field c | field d | "description field" | field f | field g

There is no timestamp field; it's not required.

I am able to import the CSV, and event breaking is working fine for the header and a few of the events.

However, some of the description fields contain multiple line breaks, e.g.

"Description of event blah...

... blah...

blah..
blah.."

However, these are all contained within the double quotes, inside the | delimiters.

I have the following in props.conf (entered via the import wizard):

[props]

sourcetype = csvimport

SHOULD_LINEMERGE = true

CHECK_FOR_HEADER = true

FIELD_DELIMITER = |

FIELD_QUOTE = "

MUST_BREAK_AFTER = $

I have tried the two settings below to stop the event breaking inside the quoted field, but perhaps the regex is not working?



MUST_NOT_BREAK_AFTER = |"

MUST_NOT_BREAK_BEFORE = "|

Is there a better way to do this?

Thanks


richgalloway
SplunkTrust

We've done that using a REGEX statement in our transforms.conf.

[csv-transform]
REGEX = (?<fieldA>.*?) \| (?<fieldB>.*?) \| (?<fieldC>.*?) \| (?<fieldD>.*?) \| "(?<description>[^"]*)" \| (?<fieldF>.*)
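
Each named capture group becomes a search-time field. Note the escaped \| (an unescaped pipe means alternation in the regex) and the [^"]* for the description, which is what lets the match span the embedded line breaks. For the seven-column layout in the original question you would add one more group, roughly like this (a sketch only; the field names are placeholders to rename to match your header):

[csv-transform]
REGEX = (?<fieldA>.*?) \| (?<fieldB>.*?) \| (?<fieldC>.*?) \| (?<fieldD>.*?) \| "(?<description>[^"]*)" \| (?<fieldF>.*?) \| (?<fieldG>.*)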

You must also reference this transform in your props.conf. Ours looks like this.

[csv-breaktest]
BREAK_ONLY_BEFORE = (regex unique to our first field)
CHECK_FOR_HEADER = false
KV_MODE = multi
MAX_TIMESTAMP_LOOKAHEAD = 50
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = true
REPORT-csv = csv-transform
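
Adapted to the sourcetype in the original question, the props.conf side might look roughly like this (again just a sketch, assuming each new record starts with something your first column can match and that there is no timestamp to parse):

[csvimport]
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE = (regex unique to your first field)
CHECK_FOR_HEADER = false
NO_BINARY_CHECK = true
DATETIME_CONFIG = CURRENT
REPORT-csv = csv-transform

Line merging glues the wrapped description lines back onto their event, BREAK_ONLY_BEFORE decides where each new record starts, and the REPORT transform takes over the field extraction from the automatic header-based parsing (which is why CHECK_FOR_HEADER is off here).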
---
If this reply helps you, Karma would be appreciated.