Knowledge Management

How to format a metrics_csv file for a metrics index?

jwillaime
Explorer

Hello,

I would like to know what format a CSV file needs to follow in order to feed it into a metrics index.

Does it need a header row with the mandatory fields, or do you put the field names inline with the data? Is there an example somewhere?

I would also like to know what configuration should be applied to the universal forwarder to monitor this file and send the data to the metrics indexer.

Thank you in advance!

1 Solution

bsonposh
Communicator

AFAICT, you are required to have metric_timestamp, metric_name, and _value; dimensions are optional.

Sample format (process_object_guid is a dimension):
"metric_timestamp","metric_name","_value","process_object_guid"
"1509997011","process.cpu.avg","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.cpu.min","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.cpu.max","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.cpu.last","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.ram.avg","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.ram.min","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.ram.max","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.ram.last","2563454144","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.disk.avg","38750","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.disk.min","38750","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.disk.max","38750","dbd1414b-378e-48bd-9735-bc2bab1e58fa"
"1509997011","process.disk.last","38750","dbd1414b-378e-48bd-9735-bc2bab1e58fa”

Second part of the question:

On the Universal Forwarder (UF)

inputs.conf
[monitor:///opt/metrics_data]
index = metrics
sourcetype = metrics_csv

On the Indexer

indexes.conf
[metrics]
homePath = $SPLUNK_DB/metrics/db
coldPath = $SPLUNK_DB/metrics/colddb
thawedPath = $SPLUNK_DB/metrics/thaweddb
datatype = metric
maxTotalDataSizeMB = 512000
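
As a quick check that the forwarder is actually delivering data, something like this should list the metric names that have arrived (again just a sketch, assuming the index name above):

| mcatalog values(metric_name) WHERE index=metrics

Once the monitored file has been picked up, you should see process.cpu.avg, process.cpu.min, and so on in the results.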

P.S. I submitted a docs bug asking for this to be clarified.

mattness
Splunk Employee

This was a great example of a metrics input from a CSV file! We've added it to the metrics documentation:
http://docs.splunk.com/Documentation/Splunk/7.0.0/Metrics/GetMetricsInOther#Example_of_a_CSV_file_me...

We have also updated this topic to clarify the CSV file format requirements for this kind of metrics input.
