Splunk Search

Extract JSON objects

vishaltaneja070
Motivator

How can I extract this:
"properties": {"nextLink": null,
"columns": [
{"name": "Cost", "type": "Number"},
{"name": "Date", "type": "Number"},
{"name": "Charge", "type": "String"},
{"name": "Publisher", "type": "String"},
{"name": "Resource", "type": "String"},
{"name": "Resource", "type": "String"},
{"name": "Service", "type": "String"},
{"name": "Standard", "type": "String"},
"rows": [
[2.06, 20210807, "usage", "uuuu", "hhh", "gd", "bandwidth", "azy", "HHH"],
[2.206, 20210807, "usage", "uuuhhh", "ggg", "gd", "bandwidth", "new", "YYY"] ]

The number of columns can increase.

ITWhisperer
SplunkTrust

Assuming columns is supposed to be an array of name/type pairs (I added the closing ]), that there are supposed to be 9 of these pairs (I added a Comment column), and that you have a properly formatted JSON string (I added the surrounding and closing braces), you could do something like this:

| makeresults 
| eval _raw="{\"properties\": {\"nextLink\": null,
\"columns\": [
{\"name\": \"Cost\", \"type\": \"Number\"},
{\"name\": \"Date\", \"type\": \"Number\"},
{\"name\": \"Charge\", \"type\": \"String\"},
{\"name\": \"Publisher\", \"type\": \"String\"},
{\"name\": \"Resource\", \"type\": \"String\"},
{\"name\": \"Resource\", \"type\": \"String\"},
{\"name\": \"Service\", \"type\": \"String\"},
{\"name\": \"Standard\", \"type\": \"String\"},
{\"name\": \"Comment\", \"type\": \"String\"}],
\"rows\": [
[2.06, 20210807, \"usage\", \"uuuu\", \"hhh\", \"gd\", \"bandwidth\", \"azy\", \"HHH\"],
[2.206, 20210807, \"usage\", \"uuuhhh\", \"ggg\", \"gd\", \"bandwidth\", \"new\", \"YYY\"]] }}"
| spath path="properties.columns{}.name" output=columnnames
| spath path="properties.rows{}{}" output=rows
| streamstats count as event 
| mvexpand rows
| streamstats count as row by event
| eval index=(row-1)%mvcount(columnnames)
| eval name=mvindex(columnnames,index)
| eval {name}=rows
| eval row=floor((row-1)/mvcount(columnnames))
| fields - columnnames name index rows
| stats values(*) as * by row event
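
The trick here is the modular arithmetic: spath flattens rows{}{} into a single multivalue field, mvexpand gives each value its own event, then (row-1)%mvcount(columnnames) maps each value back to its column name and floor((row-1)/mvcount(columnnames)) recovers which data row it came from. For example, with 9 columns the 11th value gets index (11-1)%9=1, so it lands under the second column name (Date), in data row floor((11-1)/9)=1, i.e. the second row. Because the column count comes from mvcount(columnnames) rather than being hard-coded, this keeps working when the number of columns increases. The streamstats count as event step numbers each original event before the expansion, so rows from different events don't get mixed together.

To run this against your real events instead of the makeresults test data, keep the pipeline and drop the first two lines - a minimal sketch, assuming your JSON arrives in _raw and using placeholder index/sourcetype names:

index=your_index sourcetype=your_sourcetype
| spath path="properties.columns{}.name" output=columnnames
| spath path="properties.rows{}{}" output=rows
| streamstats count as event
| mvexpand rows
| streamstats count as row by event
| eval index=(row-1)%mvcount(columnnames)
| eval name=mvindex(columnnames,index)
| eval {name}=rows
| eval row=floor((row-1)/mvcount(columnnames))
| fields - columnnames name index rows
| stats values(*) as * by row event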