I have an issue with an indexed field that contains a comma. Below is my CSV input:
"28650096","2013-12-02 20:30:30","blocked","porn\, sexual content","a@a.com","1.1.2.3"
"28650093","2013-12-02 20:30:30","allow","search site","b@b.com","2.2.2.4"
"28650092","2013-12-02 20:30:30","blocked","gambling","c@c.com","3.3.3.2"
my props.conf
[temp-audit]
FIELD_DELIMITER = ,
INDEXED_EXTRACTIONS = csv
KV_MODE = none
NO_BINARY_CHECK = 1
REPORT-audit = temp-audit-csv
SHOULD_LINEMERGE = false
pulldown_type = 1
my transforms.conf
[temp-audit-csv]
DELIMS=", "
FIELDS=id,timeStamp,Type,Reason,email,SourceIP
When I add the data using "A file or directory of files", the preview shows all three events without a problem. But after the data is added, searching with "search *" returns only 2 events; it seems the first one didn't make it into the index.
Please help.
Thanks
INDEXED_EXTRACTIONS is attempting to use the first line of the file as the header (the column/field names). That is why you are only seeing 2 events. You can use DELIMS here, but in Splunk 6 we introduced:
INDEXED_EXTRACTIONS = csv
specifically so you don't have to define the fields with DELIMS. Splunk attempts to automatically read the first line of the CSV (usually the header) and create index-time fields.
So, if the file has no header, you can use INDEXED_EXTRACTIONS = csv together with the FIELD_NAMES option:
http://docs.splunk.com/Documentation/Splunk/latest/Data/Extractfieldsfromfileheadersatindextime
Or use props/transforms with DELIMS, but you cannot mix the two approaches.
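For example, a minimal props.conf using FIELD_NAMES might look like the sketch below. The field names are taken from your transforms.conf; adjust as needed, and note there is no REPORT/transforms stanza since the two approaches cannot be mixed:
[temp-audit]
INDEXED_EXTRACTIONS = csv
FIELD_NAMES = id,timeStamp,Type,Reason,email,SourceIP
KV_MODE = none
NO_BINARY_CHECK = 1
SHOULD_LINEMERGE = false
pulldown_type = 1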
Hello,
I can see you have a "," inside one of the extracted field values. It's better not to confuse Splunk; keep it simple and let Splunk decide everything. The config below should work for you.
[temp-audit]
KV_MODE = none
NO_BINARY_CHECK = 1
REPORT-audit = temp-audit-csv
SHOULD_LINEMERGE = false
pulldown_type = 1
[temp-audit-csv]
DELIMS=","
FIELDS="id","timeStamp","Type","Reason","email","SourceIP"
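Once this is in place, you can sanity-check the extractions with a simple search (the sourcetype name temp-audit is assumed from your props.conf stanza):
sourcetype=temp-audit | table id, timeStamp, Type, Reason, email, SourceIP
If the config is working, each of the three rows should appear as an event with every column in its own field.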
Thanks
Yes, you need to re-index it. You need to clear out the fishbucket on your universal forwarder. Please follow this link:
http://answers.splunk.com/answers/54070/btprobe-and-re-indexing-data
Or you can simply rename the file and Splunk will re-index it.
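As a rough sketch of the fishbucket reset (paths assume a default $SPLUNK_HOME install, and /path/to/audit.csv is a placeholder for your actual file; the exact btprobe options are covered in the link above):
splunk stop
splunk cmd btprobe -d $SPLUNK_HOME/var/lib/splunk/fishbucket/splunk_private_db --file /path/to/audit.csv --reset
splunk start
After the restart, the monitor input should treat the file as new and index it again.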
I tried the above solution and it does not work. I believe transforms.conf is for CSV column mapping, while INDEXED_EXTRACTIONS happens at index time, before transforms.conf is applied. Anyhow, here is what I did:
splunk stop
splunk clean all
splunk start
add the data source and index it. Of course, at this point transforms.conf does not exist yet. After indexing, only two events showed up.
splunk stop
// add transforms.conf and modify props.conf with the solutions recommended above
splunk start
Performed a search, and still only two events showed up.
Do I need to re-index? If yes, how do I do that?
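For a quick test on a standalone instance, one way to force a re-index is to wipe the index data and feed the file back in as a one-shot input. This is a sketch only; /path/to/audit.csv is a placeholder, and `splunk clean eventdata` deletes ALL events in the named index, so use it only on test data:
splunk stop
splunk clean eventdata -index main
splunk start
splunk add oneshot /path/to/audit.csv -sourcetype temp-audit
Because props.conf and transforms.conf now exist before the data is indexed, the index-time settings take effect on this pass.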