Getting Data In

Need help parsing a JSON and extracting details in table format

usharaniallwyn
New Member

Hi,
I have a JSON file and I want to extract a few details from it in table format.

The JSON array looks roughly like this:
[features{
    elements{
        steps{
            name
        }
    }
}
failed: 2,
passed: 0]
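
Judging from the spath paths used in the queries below, each event is presumably shaped roughly like this (an inferred sketch with failed/passed sitting on every element, not the raw file; the sample values come from the results shown further down):

{
    "features": [
        {
            "elements": [
                {
                    "steps": [
                        { "name": "the testcase name is \"ValidateNetworkBHUtilization\"" }
                    ],
                    "failed": 2,
                    "passed": 0
                }
            ]
        }
    ]
}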

My first query:

source="jsondata.json"  index="art" sourcetype="_json"|mvexpand "features{}.elements{}.failed"|rename "features{}.elements{}.failed" as FailedNumber| eval Status=if(FailedNumber=0,"Pass","Fail")|table Status,FailedNumber

Status  FailedNumber
Fail    2
Pass    0
Fail    1

My second query:

source="jsondata.json" host="CDC2-L-CG72VP2" index="art" sourcetype="_json"|spath output=myfield path="features{}.elements{}.steps{0}.name"|mvexpand myfield |table myfield

myfield
the testcase name is "ValidateNetworkBHUtilization"
the testcase is ValidateTrendAmbulatoryCondition
the testcase is TrendHomeHealthCondition

What I want:

Status  FailedNumber  myfield
Fail    2             the testcase name is "ValidateNetworkBHUtilization"
Pass    0             the testcase is ValidateTrendAmbulatoryCondition
Fail    1             the testcase is TrendHomeHealthCondition


woodcock
Esteemed Legend

Like this:

index="art" source="jsondata.json"  sourcetype="_json"
| multireport
[ mvexpand "features{}.elements{}.failed"|rename "features{}.elements{}.failed" as FailedNumber
| eval Status=if(FailedNumber=0,"Pass","Fail")
|table Status,FailedNumber
|stats count AS _serial]
[ search host="CDC2-L-CG72VP2"
|spath output=myfield path="features{}.elements{}.steps{0}.name"
|mvexpand myfield
|table myfield
| stats count AS _serial]
| selfjoin _serial
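
Another way to line these up, as a sketch (assuming failed and the step names both live on each entry of features{}.elements{}, which is what the paths in the question imply), is to expand each element once and read both fields out of it, so the status and the testcase name land on the same row by construction instead of being joined afterwards:

index="art" source="jsondata.json" sourcetype="_json"
| spath output=element path="features{}.elements{}"
| mvexpand element
| spath input=element output=FailedNumber path="failed"
| spath input=element output=myfield path="steps{0}.name"
| eval Status=if(FailedNumber=0,"Pass","Fail")
| table Status, FailedNumber, myfield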