Getting Data In

CSV delimited with double quotes and separated by commas. Header in every field

davem1984
New Member

Hello,

I'm having trouble importing a Fortinet log CSV file whose fields are delimited with double quotes and separated by commas. The header is included in every field:

"itime=1381384446","date=2013-10-10","time=06:54:06","devid=FG800C1234501000","vd=root","type=traffic","subtype=forward"

The data is not interpreted properly: each event ends up as a single field, and the automatically generated "interesting fields" make no distinction between the field name and its value.

Every line has the same number of fields. Please help. Thanks.


ogdin
Splunk Employee

Even better, use INDEXED_EXTRACTIONS=CSV in props.conf. This should auto-detect the quotes around the events and headers and adjust accordingly. If not, there are controls in header-based index-time field extraction, such as FIELD_QUOTE and HEADER_FIELD_QUOTE, that you can use to adjust the behavior.

http://docs.splunk.com/Documentation/Splunk/latest/Data/Extractfieldsfromfileheadersatindextime

Note this only works in Splunk 6.
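For reference, a minimal props.conf sketch of that approach might look like the following. The sourcetype name fortinet_csv is a placeholder and the settings may need tuning for your file, so treat this as a starting point rather than a tested configuration:

[fortinet_csv]
INDEXED_EXTRACTIONS = csv
# fields are wrapped in double quotes and separated by commas
FIELD_DELIMITER = ,
FIELD_QUOTE = "
HEADER_FIELD_QUOTE = "

This stanza needs to live on the instance that first parses the file (indexer, heavy forwarder, or a forwarder doing structured parsing), with the sourcetype assigned to the input in inputs.conf.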

grijhwani
Motivator

That is not the de facto standard format for CSV. Under normal circumstances a CSV file has a single header row naming each column, with subsequent records containing only the column values, so this seems an odd way to represent the data. You would be better off stripping the quotes and simply importing it as a raw file; since every field is already a key=value pair, Splunk will more likely than not interpret your fields correctly with no further processing.
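If you want to do that quote stripping inside Splunk rather than preprocessing the file, one option would be a SEDCMD in props.conf (the sourcetype name below is only a placeholder; adapt it to whatever you use):

[fortinet_csv]
# strip the double quotes at index time so the remaining
# key=value pairs are picked up by automatic field extraction
SEDCMD-strip_quotes = s/"//g

With the quotes gone, each field is a plain key=value pair, which Splunk's automatic search-time field extraction handles without extra configuration.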
