Splunk Search

How do you sum a field for requests that fail to meet a threshold?

New Member

I am trying to calculate the percentage of requests that fail to meet a threshold. Log events from this app are written every 15 minutes, and each event summarizes the activity during that 15-minute bucket of time.

The log data looks like this:

{"size": 7, "max": 347.62573199999997, "p95": 347.62573199999997, "mean": 260.4413714285714, "min": 217.407228}
{"size": 14, "max": 3173.706056, "p95": 3173.706056, "mean": 917.1338762857142, "min": 160.522461}

The size field represents the number of transactions summarized in that event. I'm having a hard time figuring out how to sum the size field across only the events whose mean exceeds a threshold of 500. This is what I tried, but it doesn't work.

mySearch | stats sum(size) as TotalCount, sum(size(eval(mean>500))) as bad | eval SLO=round(bad/TotalCount,4) * 100 | table SLO

Any suggestions?




Try this! You can't nest a function call inside sum() the way your attempt does; instead, wrap an eval in the sum so it contributes size when mean>500 and 0 otherwise:

your search
| stats sum(size) as TotalCount, sum(eval(if(mean>500,size,0))) as bad
| eval SLO=round(bad/TotalCount,4)*100
| table SLO
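To see why this works, here is a minimal Python sketch (not SPL) that performs the same arithmetic on the two sample events from the question; the field names and values are taken directly from the logs above.

```python
# Sample events from the question (only the fields the search uses).
events = [
    {"size": 7, "mean": 260.4413714285714},
    {"size": 14, "mean": 917.1338762857142},
]

# sum(size) as TotalCount: total transactions across all events.
total = sum(e["size"] for e in events)

# sum(eval(if(mean>500,size,0))) as bad: only events whose mean
# exceeds 500 contribute their size; others contribute 0.
bad = sum(e["size"] for e in events if e["mean"] > 500)

# eval SLO=round(bad/TotalCount,4)*100: percentage of bad transactions.
slo = round(bad / total, 4) * 100

print(total, bad, slo)  # total=21, bad=14
```

With the sample data, 14 of the 21 transactions fall in the event whose mean exceeds 500, so SLO comes out to roughly 66.67.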

New Member

Thanks! Works like a charm!
