Getting Data In

CSV headers are being indexed as events and not extracted into interesting fields.

Path Finder

CSV headers are being indexed as events instead of being extracted into interesting fields.

Header example: xys, queue, monitor, tags, ... about 20 header fields like this.

Here is the props.conf I am using:

[ csv ]
CHARSET=UTF-8
INDEXED_EXTRACTIONS=csv
KV_MODE=none
SHOULD_LINEMERGE=false
category=Structured
description=Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled=false
pulldown_type=true
HEADER_FIELD_LINE_NUMBER=1

Please help me fix this props.conf. If I need to use transforms.conf instead, please suggest the transforms settings.

Thanks
Guru.


Re: CSV headers are being indexed as events and not extracted into interesting fields.

Esteemed Legend

Try this (I removed the spaces from the stanza header, upper-cased some settings, and removed others):

[csv]
CHARSET=UTF-8
INDEXED_EXTRACTIONS=CSV
description=Comma-separated value format. Set header and other settings in "Delimited Settings"
HEADER_FIELD_LINE_NUMBER=1
TIMESTAMP_FIELDS=LIST,YOUR,FIELDS,HERE

Put this on your FORWARDER (NOT on your indexers) and restart splunkd there. This will only affect events forwarded AFTER the restart (old events will stay broken).


Re: CSV headers are being indexed as events and not extracted into interesting fields.

Path Finder

Thanks, Woodcock.
I have a universal forwarder on the server end, from which the logs are sent to the (standalone) Splunk instance. Since a UF is not a full Splunk instance, it doesn't have the capability to parse data. I'm trying to do index-time field extractions.

I'm making the props.conf changes on my indexer.

regards,
Guru.


Re: CSV headers are being indexed as events and not extracted into interesting fields.

Esteemed Legend

You do not understand; the INDEXED_EXTRACTIONS function works ONLY on the forwarding node. You MUST put it there; it will do nothing anywhere else.

You have to put this on every forwarder and then restart all Splunk instances there; read about this caveat here:
http://docs.splunk.com/Documentation/Splunk/6.0/Data/Extractfieldsfromfileheadersatindextime#Caveats

Just try it.


Re: CSV headers are being indexed as events and not extracted into interesting fields.

Path Finder

Thanks a lot, Woodcock. It works now.

Appreciate your answers .


Re: CSV headers are being indexed as events and not extracted into interesting fields.

Path Finder

Hi Woodcock ,

Thanks for your guidance .

The props.conf setup above, applied on every forwarder node, enabled header field extraction.

But now I see the header fields extracted into interesting fields, while the same header line is also listed as an event alongside the other events (rows). I'm monitoring 4 identical CSV files with the same header fields in each.

Example: under interesting fields, the header "xyz" shows the values mlm, mlx, mlr, and xyz itself.

I have to query xyz!=xyz to remove them from the results. Please suggest how to remove the header lines from the events.
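As a concrete sketch of that workaround search (the index and sourcetype names kjkj/jkj appear later in this thread, and the field name xyz is from the example above; adjust all three to your data):

```
index=kjkj sourcetype=jkj xyz!="xyz"
```

Note this only hides the header events at search time; it does not remove them from the index.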

Thanks in advance.
GURU.


Re: CSV headers are being indexed as events and not extracted into interesting fields.

SplunkTrust

Is this for events that are newly indexed after the change, or for the ones indexed before that?


Re: CSV headers are being indexed as events and not extracted into interesting fields.

Path Finder

DalJeanis ,

I see this happening with newly indexed data after the header field extraction. Please let me know if you need any further info.

Thanks


Re: CSV headers are being indexed as events and not extracted into interesting fields.

Esteemed Legend

I suspect that what you are seeing is the old/bad events mixed in with the new/correct ones. You can hide the old events with a search like this:

Your Search That Shows Headers Or Other Broken Events | delete
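A concrete, hedged version of that search (index, sourcetype, and field names are taken from elsewhere in this thread; adjust to yours). Note that the delete command requires a role with the can_delete capability, and it only masks events from search results rather than reclaiming disk space:

```
index=kjkj sourcetype=jkj xyz="xyz" | delete
```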

Re: CSV headers are being indexed as events and not extracted into interesting fields.

Path Finder

Hi Woodcock,

Thank you.

The rolling data comes from a shared drive across 10 different hosts acting as deployment clients, each with identical data and the same headers in each file.
I see that headers are being indexed from all the hosts, and the header event count equals the row event count, e.g. header events: 70, row events: 70.

My inputs.conf is like below:

[monitor:///m3/logs/csv_*.log]
sourcetype = jkj
index = kjkj
crcSalt =

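For keeping new header lines out of the index entirely, one common approach is routing them to the nullQueue with a props/transforms pair. A sketch, assuming the sourcetype jkj from the inputs above and a header row that begins with the literal fields shown earlier in this thread (the REGEX must be adjusted to match your actual header line):

props.conf:

```
[jkj]
TRANSFORMS-drop_header = drop_csv_header
```

transforms.conf:

```
[drop_csv_header]
REGEX = ^xys\s*,\s*queue\s*,
DEST_KEY = queue
FORMAT = nullQueue
```

Caveat: universal forwarders do not apply transforms.conf, so this only takes effect where full parsing happens (a heavy forwarder, or the indexer when the data has not already been parsed by INDEXED_EXTRACTIONS on the forwarder).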