Splunk Search

How do you sum a field for requests that fail to meet a threshold?

bearlmax
New Member

I am trying to calculate the percentage of requests that fail to meet a threshold. Log events from this app are written every 15 minutes and summarize the activity during that 15-minute bucket of time.

The log data looks like this:

{"size": 7, "max": 347.62573199999997, "p95": 347.62573199999997, "mean": 260.4413714285714, "min": 217.407228}
{"size": 14, "max": 3173.706056, "p95": 3173.706056, "mean": 917.1338762857142, "min": 160.522461}

The size field represents the number of transactions summarized in that event. I'm having a hard time figuring out how to sum the size field across only the events whose mean exceeds a threshold of 500. This is what I tried, but it doesn't work:

mySearch | stats sum(size) as TotalCount, sum(size(eval(mean>500))) as bad | eval SLO=round(bad/TotalCount,4) * 100 | table SLO

Any suggestions?

Thanks!


HiroshiSatoh
Champion

Try this!

| your search
| stats sum(size) as TotalCount, sum(eval(if(mean>500,size,0))) as bad
| eval SLO=round(bad/TotalCount,4) * 100
| table SLO
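If you want to sanity-check the logic before pointing it at your real search, here is a rough sketch that uses makeresults to hard-code the two sample events from the question. With those values, bad=14 and TotalCount=21, so SLO should come out to 66.67.

| makeresults count=2
| streamstats count as row
| eval size=if(row=1, 7, 14), mean=if(row=1, 260.4413714285714, 917.1338762857142)
| stats sum(size) as TotalCount, sum(eval(if(mean>500,size,0))) as bad
| eval SLO=round(bad/TotalCount,4) * 100
| table SLO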

bearlmax
New Member

Thanks! Works like a charm!
