Splunk Search

parse mv json into multiline chart

brianbcampbell
Engager

 

 

I have a field named Msg which contains JSON. That JSON contains some top-level values and an array. I need to get each item from the array onto its own line (line-chart series), and also chart some of the top-level values.

So on my line chart I want a line for each of: totalSorsTime, internalProcessingTime, remote_a, remote_b, etc.

The closest I can get is this:

 

index=wdpr_S0001469 source="*-vas-latest*" "Orchestration Summary"
| spath input=Msg    ``` the Msg field contains the JSON ```
| table _time, totalTime, totalSorsTime, internalProcessingTime, sorMetrics{}.sor, sorMetrics{}.executionTimeMs

 

 

Any nudge in the right direction would be greatly appreciated!

 

 

{
  "totalTime": 2820,
  "totalSorsTime": 1505,
  "internalProcessingTime": 1315,
  "sorMetrics": [
    {
      "sor": "remote_a",
      "executionTimeMs": 77
    },
    {
      "sor": "remote_b",
      "executionTimeMs": 27
    },
    {
      "sor": "remote_c",
      "executionTimeMs": 759
    },
    {
      "sor": "remote_d",
      "executionTimeMs": 199
    },
    {
      "sor": "remote_e",
      "executionTimeMs": 85
    },
    {
      "sor": "remote_f",
      "executionTimeMs": 252
    }
  ]
}
ITWhisperer
SplunkTrust
SplunkTrust

Your example had duplicate entries for remote_d, so I changed that. If you have duplicates in your real data, you will end up with some multivalue fields. As usual, the part before the blank lines just sets up some sample data.

| makeresults
| eval msg="{
  \"totalTime\": 2820,
  \"totalSorsTime\": 1505,
  \"internalProcessingTime\": 1315,
  \"sorMetrics\": [
    {
      \"sor\": \"remote_a\",
      \"executionTimeMs\": 77
    },
    {
      \"sor\": \"remote_b\",
      \"executionTimeMs\": 27
    },
    {
      \"sor\": \"remote_c\",
      \"executionTimeMs\": 759
    },
    {
      \"sor\": \"remote_d\",
      \"executionTimeMs\": 199
    },
    {
      \"sor\": \"remote_e\",
      \"executionTimeMs\": 106
    },
    {
      \"sor\": \"remote_f\",
      \"executionTimeMs\": 85
    },
    {
      \"sor\": \"remote_g\",
      \"executionTimeMs\": 252
    }
  ]
}"



| spath input=msg path="totalSorsTime"
| spath input=msg path="internalProcessingTime"
| spath input=msg path="sorMetrics{}" output="sorMetrics"
| streamstats count as _row 
| mvexpand sorMetrics
| spath input=sorMetrics
| eval {sor}=executionTimeMs
| fields - msg sorMetrics sor executionTimeMs
| stats values(*) as * by _row
| table *
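To feed an actual line chart from real events, the same flattening can be applied to the original search, with _time replacing the _row grouping. A sketch, not tested against the real data: the index/source terms come from the question, and the 5-minute span is an arbitrary assumption.

index=wdpr_S0001469 source="*-vas-latest*" "Orchestration Summary"
| spath input=Msg path="totalSorsTime"
| spath input=Msg path="internalProcessingTime"
| spath input=Msg path="sorMetrics{}" output="sorMetrics"
| mvexpand sorMetrics
| spath input=sorMetrics
| eval {sor}=executionTimeMs
| fields - Msg sorMetrics sor executionTimeMs
| timechart span=5m avg(*) as *

The eval {sor}=executionTimeMs step uses SPL's dynamic field-name syntax: the value of sor becomes the field name, so each array entry turns into its own column and therefore its own chart series.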


brianbcampbell
Engager

It says "Legend" next to your name and it is a deserved title! Thank you!