Splunk Search

How to parse JSON data containing an array and plot it on a bar graph / How to iterate through a JSON array in Splunk data

aayushr
New Member

I have very limited knowledge of Splunk. I am trying to parse JSON data containing an array and plot it on a bar graph.

The splunk events look like this:

event {
   project_name: "project1"
   data : [
       {"type":"type1","coverage":0}
       {"type":"type2","coverage":1}
       {"type":"type3","coverage":1}
       {"type":"type4","coverage":1}
       {"type":"type5","coverage":1}
       {"type":"type6","coverage":3}
   ]
}

There are multiple projects for which this event is sent. Each event has a JSON array with data about "type" (ranging from type1 to type6). There can be multiple such events with the same project name over time.
What I want to do is take the last event for each "project_name" and plot a bar graph comparing "coverage" across the different "type"s for each project.
Does anyone have ideas on how I might achieve that?

To add more detail, I have data with the following structure:

event {
   project_name: "project1"
   data : [
       {"type":"type1","missed":1381,"covered":177,"coverage":11}
       {"type":"type2","missed":11797,"covered":3134,"coverage":20}
       {"type":"type3","missed":2638,"covered":613,"coverage":18}
       {"type":"type4","missed":1577,"covered":140,"coverage":8}
   ]
}

There are multiple different projects, from project1 to projectn, and different events can have the same project name.
What I want to do is get the latest event for each project and plot a bar graph comparing the "coverage" of each project for a given "type".


kamlesh_vaghela
SplunkTrust

Hi @aayushr,

Can you try the following search? It will give you each project's details from its latest event, filtered to type=type1. The dedup project_name keeps only the most recent event per project (search results come back newest first), and mvzip keeps the array elements aligned while they are split back out into one row per element.

YOUR_SEARCH
| dedup project_name
| rename data{}.coverage as coverage, data{}.covered as covered, data{}.missed as missed, data{}.type as type
| eval temp = mvzip(mvzip(mvzip(coverage, covered), missed), type)
| stats count by _time project_name temp
| eval coverage=mvindex(split(temp,","),0), covered=mvindex(split(temp,","),1), missed=mvindex(split(temp,","),2), type=mvindex(split(temp,","),3)
| where type="type1"
| table project_name type missed covered coverage
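
Since the original goal was a bar graph, one way to feed this straight into Splunk's bar-chart visualization (a sketch using the same fields the search above produces, so adjust it to your data) is to swap the final table for a chart:

YOUR_SEARCH
| dedup project_name
| rename data{}.coverage as coverage, data{}.type as type
| eval temp = mvzip(coverage, type)
| stats count by _time project_name temp
| eval coverage=tonumber(mvindex(split(temp,","),0)), type=mvindex(split(temp,","),1)
| where type="type1"
| chart max(coverage) over project_name

With one row per project at that point, max(coverage) simply carries each project's latest type1 coverage onto the y-axis; pick Bar Chart on the Visualization tab to plot it. If data{}.type and the other array fields are not auto-extracted in your environment, adding | spath right after YOUR_SEARCH should parse the JSON in _raw and create them.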

The following is my sample search:

| makeresults
| eval _raw="{\"project_name\":\"project1\",\"data\":[{\"type\":\"type1\",\"missed\":1381,\"covered\":177,\"coverage\":11},{\"type\":\"type2\",\"missed\":11797,\"covered\":3134,\"coverage\":20},{\"type\":\"type3\",\"missed\":2638,\"covered\":613,\"coverage\":18},{\"type\":\"type4\",\"missed\":1577,\"covered\":140,\"coverage\":8}]}"
| append
    [| makeresults
     | eval _raw="{\"project_name\":\"project2\",\"data\":[{\"type\":\"type1\",\"missed\":1381,\"covered\":177,\"coverage\":11},{\"type\":\"type2\",\"missed\":11797,\"covered\":3134,\"coverage\":20},{\"type\":\"type3\",\"missed\":2638,\"covered\":613,\"coverage\":18},{\"type\":\"type4\",\"missed\":1577,\"covered\":140,\"coverage\":80}]}"]
| kv
| dedup project_name
| rename data{}.coverage as coverage, data{}.covered as covered, data{}.missed as missed, data{}.type as type
| eval temp = mvzip(mvzip(mvzip(coverage, covered), missed), type)
| stats count by _time project_name temp
| eval coverage=mvindex(split(temp,","),0), covered=mvindex(split(temp,","),1), missed=mvindex(split(temp,","),2), type=mvindex(split(temp,","),3)
| where type="type1"
| table project_name type missed covered coverage
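
As a variation (again a sketch on the same fields, not a tested answer): to compare all types across projects in a single clustered bar chart, keep the sample search up through the stats command and finish the pipeline like this instead:

| eval coverage=tonumber(mvindex(split(temp,","),0)), type=mvindex(split(temp,","),3)
| chart max(coverage) over project_name by type

Each project then gets one bar per type, with coverage on the y-axis.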

You can adjust the criteria in the where condition as per your requirements. Let me know if any further assistance is required.
Thanks
Happy Splunking
