CSV headers are being indexed as events instead of being extracted into interesting fields. Header example: xys, queue, monitor, tags, and so on (about 20 header fields). This is the props.conf I'm using:
[ csv ]
CHARSET=UTF-8
INDEXED_EXTRACTIONS=csv
KV_MODE=none
SHOULD_LINEMERGE=false
category=Structured
description=Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled=false
pulldown_type=true
HEADER_FIELD_LINE_NUMBER=1
Please help me fix this props.conf. If I need to use transforms.conf instead, please suggest the transforms settings.
Thanks,
Guru.

Try this (I removed the spaces from the stanza header, upper-cased some settings, and removed others):
[csv]
CHARSET=UTF-8
INDEXED_EXTRACTIONS=CSV
description=Comma-separated value format. Set header and other settings in "Delimited Settings"
HEADER_FIELD_LINE_NUMBER=1
TIMESTAMP_FIELDS=LIST,YOUR,FIELDS,HERE
Put this on your FORWARDER (NOT on your indexers) and restart splunkd there. This will only affect events forwarded AFTER the restart (old events will stay broken).
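Since transforms.conf settings were also requested: as an alternative for header lines that still get indexed as events, you can route them to the nullQueue at parse time. This is only a sketch; the sourcetype name and the REGEX here are assumptions and must be adjusted to your real sourcetype and your real header line:

props.conf (on the parsing instance; sourcetype name assumed):
[your_csv_sourcetype]
TRANSFORMS-drop_csv_header = drop_csv_header

transforms.conf:
[drop_csv_header]
REGEX = ^xys\s*,\s*queue\s*,\s*monitor
DEST_KEY = queue
FORMAT = nullQueue

Events matching REGEX are sent to the nullQueue and discarded before indexing, so only the data rows are kept.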
Thanks Woodcock.
I have a universal forwarder on the server from which the logs are sent to my standalone Splunk instance. Since a universal forwarder is not a full Splunk instance, it doesn't have the ability to parse data, so I'm trying to do index-time field extractions.
I'm making the props.conf changes on my indexer.
Regards,
Guru.
Hi Woodcock,
Thanks for your guidance.
Setting up the previous props.conf on every forwarder node enabled the header field extraction.
But now I see the header fields extracted as interesting fields, while the header line itself is still indexed as an event alongside the other events (rows). I'm monitoring 4 identical CSV files with the same header fields in them.
Example: Interesting fields > header "xyz" > values: mlm, mlx, mlr, xyz.
I have to query xyz!=xyz to filter them out of the results. Please suggest how to remove the header lines from the events.
Thanks in advance,
GURU.

Is this for events that are newly indexed after the change, or for the ones indexed before that?
DalJeanis,
I see this happening with newly indexed data, after the header field extraction was enabled. Please let me know if you need any further info.
Thanks

I suspect that what you are seeing is the old/bad events, merged with the new/correct ones. You can hide the old events with a search like this:
Your Search That Shows Headers Or Other Broken Events | delete
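Using the example field from earlier in the thread, such a search might look like this (the index, sourcetype, and field names are taken from this thread and may need adjusting; note that | delete requires the can_delete role, and it only hides events from search, it does not free disk space):

index=kjkj sourcetype=jkj xyz="xyz" | delete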
Hi Woodcock,
Thank you.
The rolling data comes from a shared drive on 10 different hosts, which are acting as deployment clients; each has identical data with the same headers in every file.
I see that the headers are being indexed from all the hosts, and the header event count equals the row event count, e.g. header events: 70, row events: 70.
My inputs.conf is like below:
monitor path : ///m3/logs/csv_*.log
sourcetype=jkj
index=kjkj
crcSalt=
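For reference, a monitor input like the one described above is normally written as a stanza of this shape (a sketch only; the crcSalt value shown is the commonly documented <SOURCE> form, given purely as an illustration since the original value was not shown, so use whatever value you actually had):

[monitor:///m3/logs/csv_*.log]
sourcetype = jkj
index = kjkj
crcSalt = <SOURCE>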

I am a bit lost. Perhaps you would be best served by starting over with a new question.
Thanks, Woodcock. Sure.

You do not understand; the INDEXED_EXTRACTIONS setting ONLY works on the forwarding node. You MUST put it there; it will do nothing anywhere else.
You have to put this on every forwarder and then restart every Splunk instance there; read about this caveat here:
http://docs.splunk.com/Documentation/Splunk/6.0/Data/Extractfieldsfromfileheadersatindextime#Caveats
Just try it.
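Concretely, a minimal forwarder-side props.conf could look like the following (a sketch: the sourcetype name jkj is taken from the inputs.conf earlier in this thread, and your_time_field is a placeholder you must replace with your real timestamp column):

$SPLUNK_HOME/etc/system/local/props.conf on each universal forwarder:
[jkj]
INDEXED_EXTRACTIONS = CSV
HEADER_FIELD_LINE_NUMBER = 1
TIMESTAMP_FIELDS = your_time_field

After deploying this, restart splunkd on each forwarder; events that were already indexed will not be re-parsed.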
Thanks a lot, Woodcock. It works now.
Appreciate your answers .
