
Newbie Question - CSV with header

msarro
Builder

Hey everyone. First, thanks for helping with all of my newbie questions, I really appreciate it. Right now I am trying to feed .CSV files into Splunk. Each CSV is set up in the following format:

TIMESTAMP,HEADERITEM1,...,LASTHEADER
R1TIMESTAMP,R1DATA2,...,R1LASTDATA
R2TIMESTAMP,R2DATA2,...,R2LASTDATA
R3TIMESTAMP,R3DATA2,...,R3LASTDATA
R4TIMESTAMP,R4DATA2,...,R4LASTDATA

I am trying to remove or just ignore the header line, but it keeps getting indexed anyway. I have set up my props.conf file to look like this:

[sip-acl]
REPORT-sipaclparse=sip-acl_parse
TRANSFORMS-null=setnullsip-acl

And the transforms.conf file to look like this:

[sip-acl_parse]
DELIMS=","
FIELDS="TIMESTAMP", "HEADERITEM1", ..., "LASTHEADER"


[setnullsip-acl]
REGEX=TIMESTAMP,HEADERITEM1
DEST_KEY=nullQueue
FORMAT=nullQueue
AT=nullQueue

Can anyone tell me what I'm doing wrong? I'd appreciate the help. Thanks!

Lowell
Super Champion

Looks like the transform you're using to drop the header isn't quite right. Try this instead:

[setnullsip-acl]
REGEX = ^TIMESTAMP,HEADERITEM1
DEST_KEY = queue
FORMAT = nullQueue

I'm assuming that your regex is correct. I recommend using an external regex testing utility for this kind of thing. I use one all the time and it has saved me from tons of headaches.
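
For reference, the full pairing would then look something like this. This is just a sketch that reuses the sip-acl sourcetype and field names from your post, so substitute your real column list for the "...":

props.conf:

[sip-acl]
REPORT-sipaclparse = sip-acl_parse
TRANSFORMS-null = setnullsip-acl

transforms.conf:

[sip-acl_parse]
DELIMS = ","
# column names elided ("...") as in your post; list your actual headers here
FIELDS = "TIMESTAMP", "HEADERITEM1", ..., "LASTHEADER"

[setnullsip-acl]
# send events matching the header line to the null queue
REGEX = ^TIMESTAMP,HEADERITEM1
DEST_KEY = queue
FORMAT = nullQueue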

msarro
Builder

That worked perfectly, thank you so much!
