Good morning. I hope you can help.
I am currently trying to monitor specific files (in .csv format) that are updated every 10 minutes or so on a server. I have changed the inputs.conf file for the App in question and I can get the data in to Splunk.
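For reference, a minimal monitor stanza in inputs.conf for this kind of setup might look like the below. The path, sourcetype name, and index are placeholders, not taken from the original post — substitute your own:

[monitor:///opt/data/reports/*.csv]
sourcetype = my_csv_reports
index = main
disabled = false

The sourcetype set here is what you would then reference in a matching props.conf stanza to control header handling and line breaking.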
The problem I have is that the headers appear within the raw event data, and the events also do not seem to be separating correctly.
Here is an example of the raw event data (bold being the headers):
It seems, though, that everything after CLOSED should be a separate event, starting again with data matching the headers.
Two questions: (1) how do I fix this so that the headers are removed from the event data, allowing me to normalise and extract fields to present in a table, and (2) how do I separate the merged events?
I have spent a huge amount of time with this so far and have not managed to make any progress so any help would be hugely appreciated.
Kind regards,
Rob.
All sorted.
Added the below to props.conf to remove the headers from the events and to recognise the .csv file format. The table headers were then shown under 'interesting fields' along with the separated data.
Many thanks for your help.
Rob.
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = csv
KV_MODE = none
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Structured
description = Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled = false
pulldown_type = true
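For completeness, these settings live under a sourcetype stanza in props.conf. A full version might look like this (the stanza name my_csv_reports is a placeholder — use whichever sourcetype you assigned in inputs.conf):

[my_csv_reports]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = csv
KV_MODE = none
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Structured
description = Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled = false
pulldown_type = true

Note that INDEXED_EXTRACTIONS = csv is applied at index time, so it needs to be deployed on the forwarder or indexer doing the parsing, and it only affects data indexed after the change.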
Will do. Many thanks for your advice.
I'll try it now and see how I get on.
Hi - Thanks for your response.
I did not create the files server-side; I just monitor a set of directories containing the .csv files. This is how they appear when indexed. We chose monitoring because the files change constantly.
I have spoken with the team who look after the server on which we are monitoring these files and they have confirmed they are comma separated and the documents are saved as CSV.
Why is Splunk not picking this up? I guess it's something I am doing wrong...
OK, maybe you want to start by eliminating that header line in the CSV?
https://answers.splunk.com/answers/49366/how-to-ignore-first-three-line-of-my-log.html
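The approach in that link boils down to routing header lines to Splunk's nullQueue with a transform, so they are discarded before indexing. A sketch, assuming the header row starts with a literal field name such as Status — the regex and sourcetype name here are guesses and must be adapted to your actual data:

props.conf:
[my_csv_reports]
TRANSFORMS-drop_header = drop_csv_header

transforms.conf:
[drop_csv_header]
REGEX = ^Status,
DEST_KEY = queue
FORMAT = nullQueue

Any event whose raw text matches REGEX is sent to the nullQueue and never indexed, so take care the pattern matches only header rows and not legitimate data.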
Also, try opening the file in Excel and see how Excel gets on with it; this will tell you whether it complies with even very loose CSV standards.
Hi, it does seem like the CSV is not formatted correctly. Every line in a CSV should be terminated with a newline, including the header. The data you copied is all on one line...
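If the file really does arrive as one long line, one option is to force event breaks with LINE_BREAKER in props.conf. This is only a sketch under the assumption that each record ends with the literal text CLOSED followed by a comma — the pattern and sourcetype name are guesses, not taken from the original data:

[my_csv_reports]
SHOULD_LINEMERGE = false
# Splunk breaks events at the capture group and discards its contents,
# so this breaks after CLOSED and drops the trailing comma.
LINE_BREAKER = CLOSED(,)

A proper fix upstream (getting the producer to write one record per line) is preferable, since INDEXED_EXTRACTIONS = csv expects conventionally newline-terminated rows.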