Set up log-to-metrics from Universal Forwarder to Splunk Enterprise

New Member

I've followed the docs for setting up log-to-metrics but I haven't been able to get it to work as intended.

I have a CSV file being monitored by a universal forwarder, which sends it on to Splunk Enterprise. I want every value in the CSV (except the date and time) to be stored as a metric in Splunk, with a metric_name matching the field name and a _value taken from the CSV file. According to the documentation, for structured data like CSV the configuration files should live on the forwarder, which is what I have done.

Here is what I have in SplunkForwarder/etc/apps/search/local/props.conf:

```ini
METRIC-SCHEMA-TRANSFORMS = metric-schema:csv-logtometrics
```
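
The METRIC-SCHEMA-TRANSFORMS setting normally sits inside a sourcetype stanza; the stanza header appears to have been lost in posting. Per the Splunk log-to-metrics docs for structured data on a forwarder, a minimal props.conf of this shape would look roughly like the following (the stanza name matches the sourcetype used elsewhere in this thread; TIMESTAMP_FIELDS values are assumptions):

```ini
[csv-logtometrics]
# Parse the CSV on the forwarder (required for structured-data log-to-metrics)
INDEXED_EXTRACTIONS = csv
# Hypothetical header names for the date/time columns to exclude from metrics
TIMESTAMP_FIELDS = Date,Time
# Points at the matching [metric-schema:...] stanza in transforms.conf
METRIC-SCHEMA-TRANSFORMS = metric-schema:csv-logtometrics
```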

In SplunkForwarder/etc/apps/search/local/transforms.conf:
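
(The stanza body seems to have been dropped from the post. Per the Splunk docs, a log-to-metrics transforms.conf stanza using _ALLNUMS_ would look roughly like this; this is a sketch, not the poster's exact config:)

```ini
[metric-schema:csv-logtometrics]
# Treat every numeric field in the event as a separate measure
METRIC-SCHEMA-MEASURES = _ALLNUMS_
```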


And in SplunkForwarder/etc/apps/search/local/inputs.conf

```ini
sourcetype = csv-logtometrics
disabled = false
```
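
(The [monitor://...] stanza header is missing from the post. For reference, a complete input of this kind would look roughly like the following; the file path and index name are placeholders, and routing via the index setting assumes a metrics index already exists on the indexer:)

```ini
[monitor:///path/to/data.csv]
sourcetype = csv-logtometrics
# Send the converted measurements to a metrics-type index (hypothetical name)
index = my_metrics_index
disabled = false
```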

With this config, when I search in Splunk I get results for metric_name and _value, but only for the first CSV column (Field1 in this case). How do I get the other CSV columns to show up as metrics as well? My understanding was that _ALLNUMS_ should cause every numeric field in the CSV to be read as a metric, but it appears to apply only to the first field.

I also haven't figured out how to get these results into a metrics index, rather than having them searchable only as normal event logs. I tried creating a matching metrics sourcetype on the Splunk Enterprise side, but that didn't seem to work. I get no results when running:

```
| mcatalog values(metric_name)
```
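
(Editor's note: if the data is reaching a metrics index at all, mcatalog usually needs the index scoped explicitly with a WHERE clause; a sketch, with a hypothetical index name:)

```
| mcatalog values(metric_name) WHERE index=my_metrics_index
```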

Path Finder

Did you resolve this issue? I have the same problem.


New Member

No, I never found a way to make this work.
