
Using the number of events in bins to find percentile

jonnymolina
Engager

Hello all,

I have a seemingly simple goal: bucket events into time bins and find the 95th percentile of the number of events per bin. I'm able to get the count for each bin, but I'm not sure how to feed those counts into a percentile function like perc95().

This is how I'm getting the count for each bin:
| bin _time span=5m | stats count by _time

Now I want to use the values in the count column as an input list to calculate p95().

Thanks for the help in advance.

1 Solution

renjith_nair
Legend

@jonnymolina,

Does this work for you?

index=_internal | bucket span=5m _time | stats count by _time | eventstats perc95(count) as p95
Happy Splunking!



jonnymolina
Engager

I also figured out that you can return a single p95 value by doing:

| bin _time span=5m | stats count by _time | stats perc95(count) as p95


renjith_nair
Legend

You could do that as well if you only need p95, but all other fields will be gone. eventstats is used to keep all the fields in the output.
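To see the difference concretely, here is a self-contained sketch that generates synthetic events with makeresults so it runs without a real index (the event count and time range are just illustrative):

```
| makeresults count=1000
| eval _time = _time - (random() % 7200)
| bin _time span=5m
| stats count by _time
| eventstats perc95(count) as p95
```

This keeps one row per 5-minute bin, each carrying its own count plus the same overall p95 value; replacing the final eventstats with `stats perc95(count) as p95` collapses the output to a single row.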

Happy Splunking!

jonnymolina
Engager

This works! Thank you so much. As a side topic, could this be achieved through streamstats as well?


renjith_nair
Legend

I wouldn't do it with streamstats because it's designed for running ("streaming") calculations that accumulate event by event, whereas eventstats computes the aggregate over all events and writes the result onto every event.
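For completeness, streamstats would give each row the p95 of the counts seen so far, not the overall p95; a sketch of what that looks like:

```
| bin _time span=5m
| stats count by _time
| streamstats perc95(count) as running_p95
```

Only the final row's running_p95 matches the overall value, which is why eventstats (or a second stats) is the better fit for this goal.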

Happy Splunking!