Knowledge Management

How to format a metrics_csv file for a metrics index?

jwillaime
Explorer

Hello,

I would like to know what format a CSV file needs to follow in order to feed it into a metrics index.

Does it need a header with the mandatory fields, or do the field names go inline with the data? Is there an example somewhere?

I would also like to know what configuration should be applied to the universal forwarder to monitor this file and send the data to the metrics indexer.

Thank you in advance!


bsonposh
Communicator

AFAICT, you are required to have metric_timestamp, metric_name, and _value; dimensions are optional.

Sample format (process_object_guid is a dimension):
"metric_timestamp","metric_name","_value","process_object_guid"
"1509997011","process.cpu.avg","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.cpu.min","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.cpu.max","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.cpu.last","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.ram.avg","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.ram.min","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.ram.max","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.ram.last","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.disk.avg","38750","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.disk.min","38750","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.disk.max","38750","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.disk.last","38750","dbd1414b-378e-48bd-9735-bc2bab1e58fa”

Second part of the question:

On the Universal Forwarder (UF)

inputs.conf
[monitor:///opt/metrics_data]
index = metrics
sourcetype = metrics_csv

On the Indexer

indexes.conf
[metrics]
homePath = $SPLUNK_DB/metrics/db
coldPath = $SPLUNK_DB/metrics/colddb
thawedPath = $SPLUNK_DB/metrics/thaweddb
datatype = metric
maxTotalDataSizeMB = 512000
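
Once the forwarder picks the file up, one way to confirm that the data points actually landed is to run an mstats search against the metrics index. The sketch below does that over Splunk's REST search export endpoint from Python (using the third-party requests library); the host, port, and credentials are placeholders for your environment, and certificate verification is disabled only on the assumption of a lab setup with a self-signed certificate.

import requests

SPLUNK_HOST = "https://splunk.example.com:8089"   # assumption: management port of your search head
AUTH = ("admin", "changeme")                      # assumption: replace with real credentials

# 7.x-style mstats query against the [metrics] index defined above
query = (
    '| mstats avg(_value) '
    'WHERE index=metrics AND metric_name="process.cpu.avg" span=1m'
)

resp = requests.post(
    f"{SPLUNK_HOST}/services/search/jobs/export",
    data={"search": query, "output_mode": "json", "earliest_time": "-1h"},
    auth=AUTH,
    verify=False,  # assumption: self-signed certificate in a lab setup
)
resp.raise_for_status()
for line in resp.iter_lines():
    if line:
        print(line.decode("utf-8"))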

P.S. I submitted a docs bug asking for this to be clarified.

mattness
Splunk Employee

This was a great example of a metrics input from a CSV file! We've added it to the metrics documentation:
http://docs.splunk.com/Documentation/Splunk/7.0.0/Metrics/GetMetricsInOther#Example_of_a_CSV_file_me...

We have also updated this topic to clarify the CSV file format requirements for this kind of metrics input.
