<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Parsing multiple values in fields from JSON key value format in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/Parsing-multiple-values-in-fields-from-JSON-key-value-format/m-p/684576#M233674</link>
    <description>&lt;P&gt;Before I rant, thank you for sharing valid mock data in text. &amp;nbsp;That said, this is the second time in as many days that I have felt like screaming at lazy developers who make terrible use of JSON arrays. (The developer might be you. &amp;nbsp;But the rant stands&lt;span class="lia-unicode-emoji" title=":smirking_face:"&gt;😏&lt;/span&gt;) &amp;nbsp;Your data would have much cleaner, self-evident semantics had the developer simply used this:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[
 {
  "attributes": {"host.name":{"stringValue":"myname1"},"telemetry.sdk.name":{"stringValue":"my_sdk"}},
  "metrics": {"hw.host.energy":{"dataPoints":[{"timeUnixNano":"1712951030986039000","asDouble":359}]},"hw.host.power":{"dataPoints":[{"timeUnixNano":"1712951030986039000","asDouble":26}]}}
 },
 {
  "attributes": {"host.name":{"stringValue":"myname2"},"telemetry.sdk.name":{"stringValue":"my_sdk"}},
  "metrics": {"hw.host.energy":{"dataPoints":[{"timeUnixNano":"1712951030987780000","asDouble":211}]}}
 }
]&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In other words, only two JSON arrays in the original data are used correctly. &amp;nbsp;&lt;U&gt;resourceMetrics.resource.attributes[]&lt;/U&gt; and &lt;U&gt;resourceMetrics.scopeMetrics.metrics[]&lt;/U&gt; are a total abomination of the intent of JSON arrays. &amp;nbsp;Speak to your developers to see if they could change the data structure not just for Splunk, but for future maintainers of their own code and any other downstream team as well.&lt;/P&gt;&lt;P&gt;Now that this is off my chest, I understand that it will take more than one day for developers to change code even if you convince them on day one. &amp;nbsp;Here is the SPL that I use to tabulate your data like the following:&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD width="170.265625px"&gt;host.name.stringValue&lt;/TD&gt;&lt;TD width="203.265625px"&gt;hw.host.energy{}.asDouble&lt;/TD&gt;&lt;TD width="237.5625px"&gt;hw.host.energy{}.timeUnixNano&lt;/TD&gt;&lt;TD width="40px"&gt;hw.host.power{}.asDouble&lt;/TD&gt;&lt;TD width="193.40625px"&gt;hw.host.power{}.timeUnixNano&lt;/TD&gt;&lt;TD width="67.421875px"&gt;sdk.name.stringValue&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="170.265625px"&gt;myname1&lt;/TD&gt;&lt;TD width="203.265625px"&gt;359&lt;/TD&gt;&lt;TD width="237.5625px"&gt;1712951030986039000&lt;/TD&gt;&lt;TD width="40px"&gt;26&lt;/TD&gt;&lt;TD width="193.40625px"&gt;1712951030986039000&lt;/TD&gt;&lt;TD width="67.421875px"&gt;my_sdk&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="170.265625px"&gt;myname2&lt;/TD&gt;&lt;TD width="203.265625px"&gt;211&lt;/TD&gt;&lt;TD width="237.5625px"&gt;1712951030987780000&lt;/TD&gt;&lt;TD width="40px"&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD width="193.40625px"&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD width="67.421875px"&gt;my_sdk&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;In this form, I have assumed that dataPoints[] is the only node of interest under 
resourceMetrics[].scopeMetrics[].metrics.gauge&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| spath path=resourceMetrics{}
| fields - _* resourceMetrics{}.*
| mvexpand resourceMetrics{}
| spath input=resourceMetrics{} path=resource.attributes{}
| spath input=resourceMetrics{} path=scopeMetrics{}
| spath input=scopeMetrics{} path=metrics{}
| fields - resourceMetrics{} scopeMetrics{}
| foreach resource.attributes{} mode=multivalue
    [eval key = mvappend(key, json_extract(&amp;lt;&amp;lt;ITEM&amp;gt;&amp;gt;, "key"))]
| eval idx = mvrange(0, mvcount(key))
| eval attributes_good = json_object()
| foreach idx mode=multivalue
    [eval attribute = mvindex('resource.attributes{}', &amp;lt;&amp;lt;ITEM&amp;gt;&amp;gt;),
    attributes_good = json_set_exact(attributes_good, json_extract(attribute, "key"), json_extract(attribute, "value"))]
| fields - key attribute resource.attributes{}
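``` illustrative checkpoint: at this point attributes_good for the first mock event should look like {"host.name":{"stringValue":"myname1"},"telemetry.sdk.name":{"stringValue":"my_sdk"}} ```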
| foreach metrics{} mode=multivalue
    [eval name = mvappend(name, json_extract(&amp;lt;&amp;lt;ITEM&amp;gt;&amp;gt;, "name"))]
| eval name = if(isnull(name), json_extract('metrics{}', "name"), name)
| eval idx = mvrange(0, mvcount(name))
| eval metrics_good = json_object()
| foreach idx mode=multivalue
    [eval metric = mvindex('metrics{}', &amp;lt;&amp;lt;ITEM&amp;gt;&amp;gt;),
    metrics_good = json_set_exact(metrics_good, json_extract(metric, "name"), json_extract(metric, "gauge.dataPoints"))]
    ``` the above assumes that gauge.dataPoints is the only subnode of interest ```
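``` illustrative checkpoint: at this point metrics_good for the first mock event should look like {"hw.host.energy":[{"timeUnixNano":"1712951030986039000","asDouble":359}],"hw.host.power":[{"timeUnixNano":"1712951030986039000","asDouble":26}]} ```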
| fields - idx name metric metrics{}
``` the above transforms array-laden JSON into easily understandable JSON ```
| spath input=attributes_good
| spath input=metrics_good
| fields - *_good
``` the following is only needed if dataPoints[] actually contains multiple values. This is the only code requiring prior knowledge about data fields ```
| mvexpand hw.host.energy{}.timeUnixNano
| mvexpand hw.host.power{}.timeUnixNano&lt;/LI-CODE&gt;&lt;P&gt;(The &lt;FONT face="courier new,courier"&gt;fields - xxx&lt;/FONT&gt; commands are not essential; they just declutter the view.) &amp;nbsp;Hope this helps.&lt;/P&gt;&lt;P&gt;This is an emulation you can play with and compare with real data:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| makeresults
| eval _raw = "{
  \"resourceMetrics\": [
    {
      \"resource\": {
        \"attributes\": [
          {
            \"key\": \"host.name\",
            \"value\": {
              \"stringValue\": \"myname1\"
            }
          },
          {
            \"key\": \"telemetry.sdk.name\",
            \"value\": {
              \"stringValue\": \"my_sdk\"
            }
          }
        ]
      },
      \"scopeMetrics\": [
        {
          \"metrics\": [
            {
              \"name\": \"hw.host.energy\",
              \"gauge\": {
                \"dataPoints\": [
                  {
                    \"timeUnixNano\": \"1712951030986039000\",
                    \"asDouble\": 359
                  }
                ]
              }
            },
            {
              \"name\": \"hw.host.power\",
              \"gauge\": {
                \"dataPoints\": [
                  {
                    \"timeUnixNano\": \"1712951030986039000\",
                    \"asDouble\": 26
                  }
                ]
              }
            }
          ]
        }
      ]
    },
    {
      \"resource\": {
        \"attributes\": [
          {
            \"key\": \"host.name\",
            \"value\": {
              \"stringValue\": \"myname2\"
            }
          },
          {
            \"key\": \"telemetry.sdk.name\",
            \"value\": {
              \"stringValue\": \"my_sdk\"
            }
          }
        ]
      },
      \"scopeMetrics\": [
        {
          \"metrics\": [
            {
              \"name\": \"hw.host.energy\",
              \"gauge\": {
                \"dataPoints\": [
                  {
                    \"timeUnixNano\": \"1712951030987780000\",
                    \"asDouble\": 211
                  }
                ]
              }
            }
          ]
        }
      ]
    }
  ]
}"
| spath
``` data emulation above ```&lt;/LI-CODE&gt;&lt;P&gt;Final thoughts about data structures with self-evident semantics: If my speculation about&amp;nbsp;dataPoints[] being the only node of interest under resourceMetrics[].scopeMetrics[].metrics.gauge stands, good data could be further simplified to&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[
 {
  "attributes": {"host.name":{"stringValue":"myname1"},"telemetry.sdk.name":{"stringValue":"my_sdk"}},
  "metrics": {"hw.host.energy":[{"timeUnixNano":"1712951030986039000","asDouble":359}],"hw.host.power":[{"timeUnixNano":"1712951030986039000","asDouble":26}]}
 },
 {
  "attributes": {"host.name":{"stringValue":"myname2"},"telemetry.sdk.name":{"stringValue":"my_sdk"}},
  "metrics": {"hw.host.energy":[{"timeUnixNano":"1712951030987780000","asDouble":211}]}
 }
]&lt;/LI-CODE&gt;&lt;P&gt;I do understand that listing hw.host.energy and hw.host.power as coexisting columns is different from your illustrated output and may not suit your needs. &amp;nbsp;But presentation can easily be adapted. &amp;nbsp;Bad data structure remains bad.&lt;/P&gt;</description>
    <pubDate>Wed, 17 Apr 2024 22:18:00 GMT</pubDate>
    <dc:creator>yuanliu</dc:creator>
    <dc:date>2024-04-17T22:18:00Z</dc:date>
    <item>
      <title>Parsing multiple values in fields from JSON key value format</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Parsing-multiple-values-in-fields-from-JSON-key-value-format/m-p/684554#M233668</link>
      <description>&lt;P&gt;I have some JSON output that is in a key-value structure (protobuf3-formatted--this is OTLP data going into Splunk Enterprise events) and it has multiple values in each field. There are multiple key-value attributes stored under an attributes parent, and then its fields are under a metric parent. I want to take the host.name attribute and map it to every metrics value I see.&lt;/P&gt;&lt;P&gt;Here is a working example of the raw JSON:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="java"&gt;{
  "resourceMetrics": [
    {
      "resource": {
        "attributes": [
          {
            "key": "host.name",
            "value": {
              "stringValue": "myname1"
            }
          },
          {
            "key": "telemetry.sdk.name",
            "value": {
              "stringValue": "my_sdk"
            }
          }
        ]
      },
      "scopeMetrics": [
        {
          "metrics": [
            {
              "name": "hw.host.energy",
              "gauge": {
                "dataPoints": [
                  {
                    "timeUnixNano": "1712951030986039000",
                    "asDouble": 359
                  }
                ]
              }
            },
            {
              "name": "hw.host.power",
              "gauge": {
                "dataPoints": [
                  {
                    "timeUnixNano": "1712951030986039000",
                    "asDouble": 26
                  }
                ]
              }
            }
          ]
        }
      ]
    },
    {
      "resource": {
        "attributes": [
          {
            "key": "host.name",
            "value": {
              "stringValue": "myname2"
            }
          },
          {
            "key": "telemetry.sdk.name",
            "value": {
              "stringValue": "my_sdk"
            }
          }
        ]
      },
      "scopeMetrics": [
        {
          "metrics": [
            {
              "name": "hw.host.energy",
              "gauge": {
                "dataPoints": [
                  {
                    "timeUnixNano": "1712951030987780000",
                    "asDouble": 211
                  }
                ]
              }
            }
          ]
        }
      ]
    }
  ]
}&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;There may be multiple attributes, in various order, but I am only interested in grabbing the &lt;STRONG&gt;host.name &lt;/STRONG&gt;value from there, and then associating &lt;STRONG&gt;host.name&lt;/STRONG&gt; to all metrics under the &lt;STRONG&gt;metrics&lt;/STRONG&gt; parent within the &lt;STRONG&gt;resource&lt;/STRONG&gt; parent. The metrics parent may contain multiple metrics in the array. And then new resources (with new &lt;STRONG&gt;host.name&lt;/STRONG&gt; and new &lt;STRONG&gt;metrics&lt;/STRONG&gt;) would show up as the next &lt;STRONG&gt;resource&lt;/STRONG&gt; entry in the &lt;STRONG&gt;resources&lt;/STRONG&gt; array.&lt;/P&gt;&lt;P&gt;So what I want is something like this in a row-based format of host.name.value &amp;gt; metric:&lt;/P&gt;&lt;TABLE border="1" width="100%"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD width="50%"&gt;&lt;STRONG&gt;host.name&lt;/STRONG&gt;&lt;/TD&gt;&lt;TD width="50%"&gt;&lt;STRONG&gt;metric&lt;/STRONG&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="50%"&gt;host.name,myname1&lt;/TD&gt;&lt;TD width="50%" height="25px"&gt;hw.host.energy,359&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="50%"&gt;host.name,myname1&lt;/TD&gt;&lt;TD width="50%" height="25px"&gt;hw.host.power,26&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="50%"&gt;host.name,myname2&lt;/TD&gt;&lt;TD width="50%"&gt;hw.host.energy,211&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The problem I am having is I don't want the other attributes from the attribute parent, which in the example is the telemetry.sdk.name key and value. 
But since they are there, I can't figure out how to zip and expand properly, as the&amp;nbsp;telemetry.sdk.name value gets associated to legit metrics, looking something like below, which would mean if I drop row 2 I lose the power metric = 26 for myname1.&lt;/P&gt;&lt;P&gt;Parsing some spaths, the structure looks something like this:&lt;/P&gt;&lt;TABLE border="1" width="100%"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD width="50%" height="25px"&gt;&lt;STRONG&gt;attr_zip&lt;/STRONG&gt;&lt;/TD&gt;&lt;TD width="50%" height="25px"&gt;&lt;STRONG&gt;metric_zip&lt;/STRONG&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="50%" height="25px"&gt;host.name,myname1&lt;/TD&gt;&lt;TD width="50%" height="25px"&gt;hw.host.energy,359&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="50%" height="25px"&gt;telemetry.sdk.name,my_sdk&lt;/TD&gt;&lt;TD width="50%" height="25px"&gt;hw.host.power,26&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="50%"&gt;host.name,myname2&lt;/TD&gt;&lt;TD width="50%"&gt;hw.host.energy,211&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="50%" height="25px"&gt;telemetry.sdk.name,my_sdk&lt;/TD&gt;&lt;TD width="50%" height="25px"&gt;&amp;nbsp;&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I looked at mvfilter but can't seem to find a way to handle a variable number of attributes that may show up in the left column attr_zip, as it seems I need to know how many values I fill down in the field, and I am not sure how to get a count of the values from the right column metric_zip to know how many values down in attr_zip to fill.&lt;/P&gt;&lt;P&gt;In JSON, all the &lt;STRONG&gt;metrics&lt;/STRONG&gt; values share the same &lt;STRONG&gt;resource &lt;/STRONG&gt;so I should logically be able to reference the parent resource.attribute.host.name.value, and concatenate that to every metric value.&lt;/P&gt;&lt;P&gt;Here's my current SPL, where I can get the columns concatenated properly, but would need to drop the rows in attr_zip that don't match the key of 
host.name:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| spath output=host_name path=resourceMetrics{}.resource.attributes{} 
| mvexpand host_name 
| spath output=attribute path=resourceMetrics{}.resource.attributes{}.key
| spath output=attribute_value path=resourceMetrics{}.resource.attributes{}.value.stringValue
| spath output=time resourceMetrics{}.scopeMetrics{}.metrics{}.gauge.dataPoints{}.timeUnixNano
| spath output=metric_name resourceMetrics{}.scopeMetrics{}.metrics{}.name
| spath output=metric_value resourceMetrics{}.scopeMetrics{}.metrics{}.gauge.dataPoints{}.asDouble
| eval attr_zip=mvzip(attribute, attribute_value)
| eval metric_zip=mvzip(metric_name, metric_value)
| table attribute,attribute_value, attr_zip, metric_zip&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Anyone able to offer some guidance?&lt;/P&gt;</description>
      <pubDate>Wed, 17 Apr 2024 18:48:08 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Parsing-multiple-values-in-fields-from-JSON-key-value-format/m-p/684554#M233668</guid>
      <dc:creator>sholl</dc:creator>
      <dc:date>2024-04-17T18:48:08Z</dc:date>
    </item>
    <item>
      <title>Re: Parsing multiple values in fields from JSON key value format</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Parsing-multiple-values-in-fields-from-JSON-key-value-format/m-p/684576#M233674</link>
      <description>&lt;P&gt;Before I rant, thank you for sharing valid mock data in text. &amp;nbsp;That said, this is the second time in as many days that I have felt like screaming at lazy developers who make terrible use of JSON arrays. (The developer might be you. &amp;nbsp;But the rant stands&lt;span class="lia-unicode-emoji" title=":smirking_face:"&gt;😏&lt;/span&gt;) &amp;nbsp;Your data would have much cleaner, self-evident semantics had the developer simply used this:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[
 {
  "attributes": {"host.name":{"stringValue":"myname1"},"telemetry.sdk.name":{"stringValue":"my_sdk"}},
  "metrics": {"hw.host.energy":{"dataPoints":[{"timeUnixNano":"1712951030986039000","asDouble":359}]},"hw.host.power":{"dataPoints":[{"timeUnixNano":"1712951030986039000","asDouble":26}]}}
 },
 {
  "attributes": {"host.name":{"stringValue":"myname2"},"telemetry.sdk.name":{"stringValue":"my_sdk"}},
  "metrics": {"hw.host.energy":{"dataPoints":[{"timeUnixNano":"1712951030987780000","asDouble":211}]}}
 }
]&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In other words, only two JSON arrays in the original data are used correctly. &amp;nbsp;&lt;U&gt;resourceMetrics.resource.attributes[]&lt;/U&gt; and &lt;U&gt;resourceMetrics.scopeMetrics.metrics[]&lt;/U&gt; are a total abomination of the intent of JSON arrays. &amp;nbsp;Speak to your developers to see if they could change the data structure not just for Splunk, but for future maintainers of their own code and any other downstream team as well.&lt;/P&gt;&lt;P&gt;Now that this is off my chest, I understand that it will take more than one day for developers to change code even if you convince them on day one. &amp;nbsp;Here is the SPL that I use to tabulate your data like the following:&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD width="170.265625px"&gt;host.name.stringValue&lt;/TD&gt;&lt;TD width="203.265625px"&gt;hw.host.energy{}.asDouble&lt;/TD&gt;&lt;TD width="237.5625px"&gt;hw.host.energy{}.timeUnixNano&lt;/TD&gt;&lt;TD width="40px"&gt;hw.host.power{}.asDouble&lt;/TD&gt;&lt;TD width="193.40625px"&gt;hw.host.power{}.timeUnixNano&lt;/TD&gt;&lt;TD width="67.421875px"&gt;sdk.name.stringValue&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="170.265625px"&gt;myname1&lt;/TD&gt;&lt;TD width="203.265625px"&gt;359&lt;/TD&gt;&lt;TD width="237.5625px"&gt;1712951030986039000&lt;/TD&gt;&lt;TD width="40px"&gt;26&lt;/TD&gt;&lt;TD width="193.40625px"&gt;1712951030986039000&lt;/TD&gt;&lt;TD width="67.421875px"&gt;my_sdk&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="170.265625px"&gt;myname2&lt;/TD&gt;&lt;TD width="203.265625px"&gt;211&lt;/TD&gt;&lt;TD width="237.5625px"&gt;1712951030987780000&lt;/TD&gt;&lt;TD width="40px"&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD width="193.40625px"&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD width="67.421875px"&gt;my_sdk&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;In this form, I have assumed that dataPoints[] is the only node of interest under 
resourceMetrics[].scopeMetrics[].metrics.gauge&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| spath path=resourceMetrics{}
| fields - _* resourceMetrics{}.*
| mvexpand resourceMetrics{}
| spath input=resourceMetrics{} path=resource.attributes{}
| spath input=resourceMetrics{} path=scopeMetrics{}
| spath input=scopeMetrics{} path=metrics{}
| fields - resourceMetrics{} scopeMetrics{}
| foreach resource.attributes{} mode=multivalue
    [eval key = mvappend(key, json_extract(&amp;lt;&amp;lt;ITEM&amp;gt;&amp;gt;, "key"))]
| eval idx = mvrange(0, mvcount(key))
| eval attributes_good = json_object()
| foreach idx mode=multivalue
    [eval attribute = mvindex('resource.attributes{}', &amp;lt;&amp;lt;ITEM&amp;gt;&amp;gt;),
    attributes_good = json_set_exact(attributes_good, json_extract(attribute, "key"), json_extract(attribute, "value"))]
| fields - key attribute resource.attributes{}
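``` illustrative checkpoint: at this point attributes_good for the first mock event should look like {"host.name":{"stringValue":"myname1"},"telemetry.sdk.name":{"stringValue":"my_sdk"}} ```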
| foreach metrics{} mode=multivalue
    [eval name = mvappend(name, json_extract(&amp;lt;&amp;lt;ITEM&amp;gt;&amp;gt;, "name"))]
| eval name = if(isnull(name), json_extract('metrics{}', "name"), name)
| eval idx = mvrange(0, mvcount(name))
| eval metrics_good = json_object()
| foreach idx mode=multivalue
    [eval metric = mvindex('metrics{}', &amp;lt;&amp;lt;ITEM&amp;gt;&amp;gt;),
    metrics_good = json_set_exact(metrics_good, json_extract(metric, "name"), json_extract(metric, "gauge.dataPoints"))]
    ``` the above assumes that gauge.dataPoints is the only subnode of interest ```
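``` illustrative checkpoint: at this point metrics_good for the first mock event should look like {"hw.host.energy":[{"timeUnixNano":"1712951030986039000","asDouble":359}],"hw.host.power":[{"timeUnixNano":"1712951030986039000","asDouble":26}]} ```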
| fields - idx name metric metrics{}
``` the above transforms array-laden JSON into easily understandable JSON ```
| spath input=attributes_good
| spath input=metrics_good
| fields - *_good
``` the following is only needed if dataPoints[] actually contains multiple values. This is the only code requiring prior knowledge about data fields ```
| mvexpand hw.host.energy{}.timeUnixNano
| mvexpand hw.host.power{}.timeUnixNano&lt;/LI-CODE&gt;&lt;P&gt;(The &lt;FONT face="courier new,courier"&gt;fields - xxx&lt;/FONT&gt; commands are not essential; they just declutter the view.) &amp;nbsp;Hope this helps.&lt;/P&gt;&lt;P&gt;This is an emulation you can play with and compare with real data:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| makeresults
| eval _raw = "{
  \"resourceMetrics\": [
    {
      \"resource\": {
        \"attributes\": [
          {
            \"key\": \"host.name\",
            \"value\": {
              \"stringValue\": \"myname1\"
            }
          },
          {
            \"key\": \"telemetry.sdk.name\",
            \"value\": {
              \"stringValue\": \"my_sdk\"
            }
          }
        ]
      },
      \"scopeMetrics\": [
        {
          \"metrics\": [
            {
              \"name\": \"hw.host.energy\",
              \"gauge\": {
                \"dataPoints\": [
                  {
                    \"timeUnixNano\": \"1712951030986039000\",
                    \"asDouble\": 359
                  }
                ]
              }
            },
            {
              \"name\": \"hw.host.power\",
              \"gauge\": {
                \"dataPoints\": [
                  {
                    \"timeUnixNano\": \"1712951030986039000\",
                    \"asDouble\": 26
                  }
                ]
              }
            }
          ]
        }
      ]
    },
    {
      \"resource\": {
        \"attributes\": [
          {
            \"key\": \"host.name\",
            \"value\": {
              \"stringValue\": \"myname2\"
            }
          },
          {
            \"key\": \"telemetry.sdk.name\",
            \"value\": {
              \"stringValue\": \"my_sdk\"
            }
          }
        ]
      },
      \"scopeMetrics\": [
        {
          \"metrics\": [
            {
              \"name\": \"hw.host.energy\",
              \"gauge\": {
                \"dataPoints\": [
                  {
                    \"timeUnixNano\": \"1712951030987780000\",
                    \"asDouble\": 211
                  }
                ]
              }
            }
          ]
        }
      ]
    }
  ]
}"
| spath
``` data emulation above ```&lt;/LI-CODE&gt;&lt;P&gt;Final thoughts about data structures with self-evident semantics: If my speculation about&amp;nbsp;dataPoints[] being the only node of interest under resourceMetrics[].scopeMetrics[].metrics.gauge stands, good data could be further simplified to&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[
 {
  "attributes": {"host.name":{"stringValue":"myname1"},"telemetry.sdk.name":{"stringValue":"my_sdk"}},
  "metrics": {"hw.host.energy":[{"timeUnixNano":"1712951030986039000","asDouble":359}],"hw.host.power":[{"timeUnixNano":"1712951030986039000","asDouble":26}]}
 },
 {
  "attributes": {"host.name":{"stringValue":"myname2"},"telemetry.sdk.name":{"stringValue":"my_sdk"}},
  "metrics": {"hw.host.energy":[{"timeUnixNano":"1712951030987780000","asDouble":211}]}
 }
]&lt;/LI-CODE&gt;&lt;P&gt;I do understand that listing hw.host.energy and hw.host.power as coexisting columns is different from your illustrated output and may not suit your needs. &amp;nbsp;But presentation can easily be adapted. &amp;nbsp;Bad data structure remains bad.&lt;/P&gt;</description>
      <pubDate>Wed, 17 Apr 2024 22:18:00 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Parsing-multiple-values-in-fields-from-JSON-key-value-format/m-p/684576#M233674</guid>
      <dc:creator>yuanliu</dc:creator>
      <dc:date>2024-04-17T22:18:00Z</dc:date>
    </item>
    <item>
      <title>Re: Parsing multiple values in fields from JSON key value format</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Parsing-multiple-values-in-fields-from-JSON-key-value-format/m-p/684580#M233676</link>
      <description>&lt;P&gt;Thanks, I will check out your advice in a bit.&lt;/P&gt;&lt;P&gt;Yes, I agree that the data structure is not ideal for parsing. Unfortunately, this is output from an OpenTelemetry collector following the OpenTelemetry standard (which Splunk also embraces, though we don’t have native parsing for it yet in Splunk Enterprise), so if this takes off as the cross-vendor standard for pushing telemetry, then we are going to have to deal with ingesting data in this format more and more. Or maybe it is an opportunity to suggest formatting changes to the standard to CNCF.&lt;/P&gt;</description>
      <pubDate>Wed, 17 Apr 2024 22:44:44 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Parsing-multiple-values-in-fields-from-JSON-key-value-format/m-p/684580#M233676</guid>
      <dc:creator>sholl</dc:creator>
      <dc:date>2024-04-17T22:44:44Z</dc:date>
    </item>
  </channel>
</rss>

