Getting Data In

How can we speed up JSON array lookup?

vinodkd
New Member

Hi,

My JSON event is in this form:

{
  "TotalMemUsage": 887992,
  "ProcMemUsage": [
    {
      "Name": "firefox",
      "PID": 758,
      "Usg": 228972
    },
    {
      "Name": "eclipse",
      "PID": 569,
      "Usg": 19756
    }
  ]
}
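For context: the query below relies on the array elements already being extracted as the multivalue fields ProcMemUsage{}.Name and ProcMemUsage{}.Usg, e.g. via Splunk's automatic search-time JSON extraction. A minimal props.conf sketch (the sourcetype name here is just a placeholder):

# props.conf -- sketch only, sourcetype name is a placeholder
[proc_mem_json]
KV_MODE = json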

I have to take the average of each process's memory usage and draw a chart.
For the time being, I'm using the following query:

my_search
| rename ProcMemUsage{}.Name as PNAME | rename ProcMemUsage{}.Usg as PUSAGE
| eval x=mvzip(PNAME,PUSAGE)
| mvexpand x
| eval x=split(x,",")
| eval P_NAME=mvindex(x,0)
| eval P_USAGE=mvindex(x,1)
| stats avg(P_USAGE) as MU by P_NAME
| sort -MU | fields MU,P_NAME
| chart avg(MU) as AvgMU by P_NAME
| sort -AvgMU

But it takes a lot of time to complete (approximately 5 minutes for only 30K records).
Is there any way to optimize it? Can we use jsonutils somehow?
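
Would something like this spath-based sketch be any faster? It expands the raw array elements first and then parses each element, which skips the mvzip/split step (same field names as above; my_search is the same base search):

my_search
| spath output=proc path=ProcMemUsage{}
| fields proc
| mvexpand proc
| eval P_NAME=spath(proc, "Name"), P_USAGE=tonumber(spath(proc, "Usg"))
| chart avg(P_USAGE) as AvgMU by P_NAME
| sort -AvgMU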


vinodkd
New Member

@lguinn2: It doesn't give a noticeable speedup :'(


lguinn2
Legend

I don't really think this will make things faster, but it might help a little:

my_search 
| rename ProcMemUsage{}.Name as PNAME | rename ProcMemUsage{}.Usg as PUSAGE
| fields PNAME PUSAGE
| eval x=mvzip(PNAME,PUSAGE) 
| mvexpand x
| eval x=split(x,",")
| eval P_NAME=mvindex(x,0)
| eval P_USAGE = mvindex(x,1)
| chart avg(P_USAGE) as AvgMU by P_NAME 
| sort -AvgMU

It's also a bit cleaner. Post back with your results.
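
If it is still slow, one more thing to try (just a sketch along the same lines, not something I have timed): drop everything except the ProcMemUsage fields immediately after the base search, and convert the usage to a number before charting:

my_search
| fields ProcMemUsage*
| rename ProcMemUsage{}.Name as PNAME | rename ProcMemUsage{}.Usg as PUSAGE
| eval x=mvzip(PNAME,PUSAGE)
| mvexpand x
| eval P_NAME=mvindex(split(x,","),0), P_USAGE=tonumber(mvindex(split(x,","),1))
| chart avg(P_USAGE) as AvgMU by P_NAME
| sort -AvgMU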
