I am using the universal forwarder (UF) to monitor a directory for a CSV file on a remote server. I have configured inputs.conf on the UF to monitor the directory. I am forwarding the data to a Heavy Forwarder, which will then forward to an indexer cluster.
I want to tell Splunk where to find the time field and the header line, using a source type defined in props.conf.
Which component in the distributed environment needs to have the source type configured? The UF, HF or indexer layer?
Since you mention CSV with INDEXED_EXTRACTIONS = CSV, the stanza belongs in props.conf on the component that first reads the file — so the UF. With indexed extractions the events are parsed there and will not be reparsed at the HF or indexer level.
HEADER_FIELD_LINE_NUMBER = 1 is fine, or you can let Splunk detect the header automatically...
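A minimal sketch of the UF-side stanza, assuming a hypothetical sourcetype name, timestamp column, and time format (adjust these to your actual CSV):

```ini
# props.conf on the Universal Forwarder
# [my_csv], "timestamp", and the TIME_FORMAT value below are
# illustrative assumptions -- match them to your file.
[my_csv]
INDEXED_EXTRACTIONS = CSV
# header is on the first line of the file
HEADER_FIELD_LINE_NUMBER = 1
# CSV column that holds the event time
TIMESTAMP_FIELDS = timestamp
# strptime-style format of the values in that column
TIME_FORMAT = %Y-%m-%d %H:%M:%S
```

The monitor stanza in inputs.conf on the UF would then set sourcetype = my_csv so the two configurations are tied together.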
I just used the Add Data feature for a CSV file. I deleted all the suggested settings except INDEXED_EXTRACTIONS = CSV and finished the upload. The data and all the fields are extracted, and the generated stanza in props.conf is, surprisingly -
[csv_tst]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = csv
KV_MODE = none
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Structured
description = Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled = false
pulldown_type = true
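To sanity-check that the columns really came through as indexed fields, a quick search against the generated sourcetype should list them (the index name here is an assumption; use whichever index the upload targeted):

```
index=main sourcetype=csv_tst
| head 5
| table _time *
```

Note that the empty DATETIME_CONFIG plus INDEXED_EXTRACTIONS in the generated stanza means Splunk fell back on its automatic timestamp detection; to pin the time field explicitly, you would add TIMESTAMP_FIELDS (and usually TIME_FORMAT) to the stanza.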