I think I know the answer but just want to confirm it. I have a Universal Forwarder and want to extract a field from source and send it to the indexer. It's a regular log (not a CSV, PSV etc...) so I guess I cannot (?) use INDEXED_EXTRACTIONS.
sourcetype = mytest
EXTRACT-mytest = C:\\test2\\(?<testname>.+)\.log in source
If I add these settings to a non-forwarder Splunk instance, it works perfectly: I am able to extract the field (testname in this example) from the source. If I copy the same settings to my Universal Forwarder, it doesn't work; the data is forwarded but the field is not extracted. Do I have to convert it to a Heavy Forwarder? Or add these extractions on our indexers?
Based on the above, I just want to confirm that Universal Forwarders are not able to extract fields from source unless the files are certain structured types like CSV, TSV, etc. And if it is possible, can someone pass along an example props.conf?
Field extraction in Splunk can be done only at the parsing phase, not at the input phase, except for some structured types like CSV, JSON, etc.
So either put a Heavy Forwarder between the UF and the indexer, which can do the work, or do the extraction directly on the indexer.
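A minimal props.conf sketch for the indexer (or search head) side, reusing the sourcetype and field name from the question. Note the doubled backslashes: the Windows path separators must be escaped so they are not treated as regex escapes.

```ini
# props.conf on the indexer/search head -- EXTRACT-* is a search-time field extraction
[mytest]
EXTRACT-mytest = C:\\test2\\(?<testname>.+)\.log in source
```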
The inputs.conf file goes on the forwarder (the place that contains/pulls in the data). Whenever you are using props.conf, that needs to go on your search head (not the indexers, as some have said).
Inputs.conf goes on your forwarders. props.conf goes on the indexers.
Are you sure that's correct? I am currently using the INDEXED_EXTRACTIONS setting for CSV files, defined in props.conf on a Universal Forwarder. It's able to extract the CSV column names and assign them the values in each row.
disabled = false
sourcetype = csvdata
SHOULD_LINEMERGE = false
INDEXED_EXTRACTIONS = CSV
DELIM = ","
FIELDS = RunDate,FilerName,UserName,events_sub_Count
Yes, I'm sure. Newer UFs can do some parsing, but the general rule is to put props.conf on the indexer. You may also need to put it on your search heads if it contains search-time properties (like REPORT-*).
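For comparison, here is a sketch of the structured-data stanza using the settings documented in the props.conf spec (FIELD_DELIMITER and FIELD_NAMES are the spec's structured-extraction settings; the stanza name is assumed to match the csvdata sourcetype above):

```ini
# props.conf on the Universal Forwarder -- INDEXED_EXTRACTIONS runs at input
# time, which is why it works on a UF while EXTRACT-* does not
[csvdata]
INDEXED_EXTRACTIONS = csv
FIELD_DELIMITER = ,
FIELD_NAMES = RunDate,FilerName,UserName,events_sub_Count
SHOULD_LINEMERGE = false

# Search-time settings such as EXTRACT-* or REPORT-* would instead go in
# props.conf on the search head.
```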