Splunk Search

How do you sum a field for requests that fail to meet a threshold?

bearlmax
New Member

I am trying to calculate the percentage of requests that fail to meet a threshold. Log events from this app are written every 15 minutes and summarize the activity during that 15-minute bucket of time.

The log data looks like this

{"size": 7, "max": 347.62573199999997, "p95": 347.62573199999997, "mean": 260.4413714285714, "min": 217.407228}
{"size": 14, "max": 3173.706056, "p95": 3173.706056, "mean": 917.1338762857142, "min": 160.522461}

The size field represents the number of transactions summarized in that event. I'm having a hard time figuring out how to sum the size fields for only those events whose mean exceeds a threshold of 500. This is what I tried, but it doesn't work.

mySearch | stats sum(size) as TotalCount, sum(size(eval(mean>500))) as bad | eval SLO=round(bad/TotalCount,4) * 100 | table SLO

Any suggestions?

Thanks!


HiroshiSatoh
Champion

Try this!

your search
| stats sum(size) as TotalCount, sum(eval(if(mean>500,size,0))) as bad
| eval SLO=round(bad/TotalCount,4) * 100
| table SLO
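For anyone who wants to sanity-check the conditional-sum logic outside Splunk, here is a small Python sketch over the two sample events from the question (the 500 threshold and field names come from the original post; the event list is just the sample data re-used as strings):

```python
import json

# The two sample events from the question, as raw JSON strings.
events = [
    '{"size": 7, "max": 347.62573199999997, "p95": 347.62573199999997, "mean": 260.4413714285714, "min": 217.407228}',
    '{"size": 14, "max": 3173.706056, "p95": 3173.706056, "mean": 917.1338762857142, "min": 160.522461}',
]

total = 0  # mirrors: sum(size) as TotalCount
bad = 0    # mirrors: sum(eval(if(mean>500,size,0))) as bad

for raw in events:
    event = json.loads(raw)
    total += event["size"]
    if event["mean"] > 500:
        bad += event["size"]

# mirrors: eval SLO=round(bad/TotalCount,4) * 100
slo = round(bad / total, 4) * 100
print(total, bad, slo)
```

With the sample data, 14 of the 21 transactions fall in an event whose mean exceeds 500, so the result is roughly 66.67.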

bearlmax
New Member

Thanks! Works like a charm!
