I've followed the docs for setting up log-to-metrics but I haven't been able to get it to work as intended.
I have a CSV file being monitored by a universal forwarder, which then sends the data to Splunk Enterprise. I want every value in the CSV (except the date and time) to be saved as a metric in Splunk, with a metric_name matching the field name and a _value taken from the CSV file. According to the documentation at https://docs.splunk.com/Documentation/Splunk/7.3.2/Metrics/L2MConfiguration, for structured data like a CSV the configuration files should be located on the forwarder, which is what I have done.
Here is what I have in SplunkForwarder/etc/apps/search/local/props.conf:
```ini
[csv-logtometrics]
FIELD_DELIMITER = ,
FIELD_NAMES = Date,Time,Field1,Field2,Field3
INDEXED_EXTRACTIONS = csv
METRIC-SCHEMA-TRANSFORMS = metric-schema:csv-logtometrics
```
And in SplunkForwarder/etc/apps/search/local/inputs.conf:

```ini
[monitor:///path/to/stats.csv]
sourcetype = csv-logtometrics
disabled = false
```
With this config, when I search in Splunk I get results for metric_name and _value, but only for the first CSV column (Field1 in this case). How do I get the values from the other CSV columns to show up as metrics as well? My understanding was that _ALLNUMS_ should cause each individual numeric field in the CSV to be read as a metric, but it appears to apply only to the first field.
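For completeness, here is the matching stanza in SplunkForwarder/etc/apps/search/local/transforms.conf, which is where I set _ALLNUMS_ (reproduced roughly from memory):

```ini
# transforms.conf on the forwarder, in the same app as props.conf.
# The stanza name matches the METRIC-SCHEMA-TRANSFORMS value in props.conf.
[metric-schema:csv-logtometrics]
# Treat every numeric field as a measure; Date and Time should be
# left alone for timestamping.
METRIC-SCHEMA-MEASURES = _ALLNUMS_
```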
I also haven't figured out how to get these results into a metrics index, rather than having them only searchable like normal event logs. I tried creating a matching metrics sourcetype on the Splunk Enterprise side, but that didn't seem to work. I get no results when running:

```
| mcatalog values(metric_name)
```
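In case it's relevant, here is roughly what I tried on the Splunk Enterprise side (the index name my_metrics is just a placeholder I picked; I'm not sure this belongs on the indexer at all, given the docs say the config goes on the forwarder for structured data):

```ini
# indexes.conf on the indexer: a dedicated metrics index
[my_metrics]
datatype = metric
homePath   = $SPLUNK_DB/my_metrics/db
coldPath   = $SPLUNK_DB/my_metrics/colddb
thawedPath = $SPLUNK_DB/my_metrics/thaweddb

# props.conf on the indexer: sourcetype mirroring the forwarder's
[csv-logtometrics]
METRIC-SCHEMA-TRANSFORMS = metric-schema:csv-logtometrics
```

Do I also need an explicit index setting (e.g. index = my_metrics) in the forwarder's inputs.conf stanza to route the data into the metrics index?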