Splunk Search

CSV file header (1st row) field extraction

sim_tcr
Communicator

Hello,

I have a CSV file in the format below:

date,time,rundate
02/09/2016,00:00.0,02/07/2016
02/09/2016,00:00.0,02/07/2016
02/09/2016,00:00.0,02/07/2016
02/09/2016,00:00.0,02/07/2016

What should the props.conf on the indexers look like so that each item in the first row (separated by commas) becomes a field name, and the values beneath it are indexed under that field?

I have already tried:

[sourcetype-name]
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1
HEADER_FIELD_DELIMITER = ,

Thanks,
Simon Mandy


woodcock
Esteemed Legend

Note that this props.conf must be deployed TO THE FORWARDER, not to the indexers because INDEXED_EXTRACTIONS is a very special case.

Perhaps you are misunderstanding how to verify that "it works". Once this goes to the forwarders and all of the Splunk instances there have been restarted, ONLY events forwarded/indexed after the restart will use the new configuration; previously indexed data will stay wrong forever.
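One way to confirm the new configuration is in effect is to look only at freshly indexed events. This is a sketch; the index and sourcetype names are taken from elsewhere in this thread and may differ in your environment:

```
index=sandbox sourcetype=uow_fitcap_csv _index_earliest=-15m
| table date time rundate
```

If the header fields show up here but not on older events, the forwarder-side props.conf is working as expected; events indexed before the restart will never gain the fields.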


lukejadamec
Super Champion

I tested this log file content:

date,time,rundate
02/09/2016,00:00.0,2/07/2016
02/09/2016,00:00.1,2/07/2016
02/09/2016,00:00.2,2/07/2016
02/09/2016,00:00.3,2/07/2016
02/09/2016,00:00.4,2/07/2016
02/09/2016,00:00.5,2/07/2016
02/09/2016,00:00.6,2/07/2016
02/09/2016,00:00.7,2/07/2016

With these config files:

inputs.conf
[monitor://C:\temp\Splunk\test\csv-test\csv-test3.csv]
disabled = false
index = test
sourcetype = csvtest3

props.conf
[csvtest3]
NO_BINARY_CHECK = true
category = Custom
disabled = false
pulldown_type = true
REPORT-csvtest3 = REPORT-csvtest3

transforms.conf
[REPORT-csvtest3]
DELIMS = ","
FIELDS = "Date","Time","runDate"

Everything works fine, with the exception of the fractional minutes: strptime cannot parse HH:MM.M, so _time will be truncated to the minute (HH:MM:SS.SSS with the seconds dropped), i.e. a log time of 12:00.9 will become an event time of 12:00:00.000.

If seconds are important, then you should ask a separate question about converting the Time field (the string extracted above) in a search into a time value with accurate seconds for sorting purposes.
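As a rough sketch of that conversion, the fractional-minute part can be turned into seconds at search time with eval. This assumes the Time field extracted by the transform above and the HH:MM.M format shown in the sample data:

```
... | rex field=Time "(?<hh>\d+):(?<mm>\d+)\.(?<frac>\d+)"
| eval secs = tonumber(hh)*3600 + tonumber(mm)*60 + tonumber("0.".frac)*60
| sort secs
```

Here 00:00.5 yields secs=30, which can then be used for accurate sorting even though _time itself stays truncated.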


lukejadamec
Super Champion

Is the time really HH:MM.M? There is no strptime variable for MM.M, so the best you're going to get is HH:MM

For example, strptime cannot interpret 00:00.5 as 00:00:30; you could get 00:00.5 to parse as 00:00:05, but that would not be accurate.


PPape
Contributor

It should work like this:

[mycsv]
CHARSET=AUTO
INDEXED_EXTRACTIONS=csv
KV_MODE=none
SHOULD_LINEMERGE=false
category=Structured
disabled=false
pulldown_type=true
HEADER_FIELD_LINE_NUMBER=1
FIELD_DELIMITER=,

Can you give an example of your csv, inputs.conf and props.conf?
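For reference, a minimal inputs.conf monitor stanza to pair with the props.conf above might look like this (the path and index here are placeholders, not values from your environment):

```
[monitor:///var/log/mycsv/sample.csv]
index = main
sourcetype = mycsv
disabled = false
```

The sourcetype set here must match the props.conf stanza name exactly for the structured extraction settings to apply.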


sim_tcr
Communicator

It did not work.

inputs.conf

[monitor://L:\Logs\csv\*.csv]
sourcetype = uow_fitcap_csv
index = sandbox
disabled = false
crcSalt=<SOURCE>

outputs.conf

[ uow_fitcap_csv  ]
CHARSET=AUTO
INDEXED_EXTRACTIONS=csv
KV_MODE=none
SHOULD_LINEMERGE=false
category=Structured
disabled=false
pulldown_type=true
HEADER_FIELD_LINE_NUMBER=1
FIELD_DELIMITER=,

PPape
Contributor

Which version of Splunk are you running?

And is that your outputs.conf or your props.conf?


sim_tcr
Communicator

We are on 6.3.3.
I mistakenly said outputs.conf; it is props.conf.


sundareshr
Legend

Try this

[csv]
SHOULD_LINEMERGE=false
INDEXED_EXTRACTIONS=csv
TIMESTAMP_FIELDS=rundate,time
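If the timestamp still is not recognized, adding an explicit TIME_FORMAT may help. This is a sketch assuming the rundate and time values shown in the question, with the fractional minutes truncated since strptime has no variable for them; remember this stanza must be deployed to the forwarder:

```
[csv]
SHOULD_LINEMERGE = false
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = rundate,time
TIME_FORMAT = %m/%d/%Y %H:%M
```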