Splunk Search

How to calculate an Eval of percentage between two rows of stats values?

becksyboy
Communicator

Hi All,

I'm trying to calculate the failureRate as a percentage between the NumberOfAuthErrors column and the TotalRequest column, but I do not get any values.

I have two columns of values and would like to calculate the failureRate for each row.

 

[SEARCH]
| bin _time span=15m
| stats count as NumberOfAuthErrors by _time
| append
[ SEARCH | bin _time span=15m | stats count as TotalRequest by _time ]
| stats values(NumberOfAuthErrors) AS NumberOfAuthErrors, values(TotalRequest) AS TotalRequest
| eval failureRate = round((NumberOfAuthErrors / TotalRequest) * 100,3)
| table TotalRequest NumberOfAuthErrors failureRate

 

 

thanks

1 Solution

ITWhisperer
SplunkTrust
SplunkTrust

Try including by _time on this line

| stats values(NumberOfAuthErrors) AS NumberOfAuthErrors, values(TotalRequest) AS TotalRequest by _time


dtburrows3
Builder

From the screenshot you shared, it appears that the

| stats values(NumberOfAuthErrors) AS NumberOfAuthErrors, values(TotalRequest) AS TotalRequest

is returning two multivalue fields, so the eval is not working as intended.
Try running this stats with an additional by _time field; that way NumberOfAuthErrors and TotalRequest should each have only one value per 15-minute interval, and the eval should then work.

If for whatever reason you are trying to match up each row of two multivalue fields (I don't really know why you would want to do this), I would stay away from using stats values(), as it dedups the values and then, I believe, sorts them. Using stats list() instead will retain the original order, but even then, if one of the datasets is missing events in one or more of the 15-minute intervals, the numbers will again be misaligned.
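To illustrate the difference, here is a run-anywhere sketch with made-up numbers (the row and val field names are just for the example, not your data):

| makeresults count=4
| streamstats count as row
| eval val=case(row==1, 5, row==2, 3, row==3, 5, row==4, 1)
| stats values(val) as values_result, list(val) as list_result

values_result comes back deduplicated and sorted (1 3 5), while list_result keeps the original event order (5 3 5 1).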

You would be better off just using _time as a by-field in the stats again, something like this:

[SEARCH]
| bin _time span=15m
| stats count as NumberOfAuthErrors by _time
| append
[ SEARCH | bin _time span=15m | stats count as TotalRequest by _time ]
| stats values(NumberOfAuthErrors) AS NumberOfAuthErrors, values(TotalRequest) AS TotalRequest by _time
| eval failureRate = round((NumberOfAuthErrors / TotalRequest) * 100,3)
| table _time, TotalRequest NumberOfAuthErrors failureRate


If you just want the overall failureRate across the entire timespan, then using stats sum() will probably be the way to go:

[SEARCH]
| bin _time span=15m
| stats count as NumberOfAuthErrors by _time
| append
[ SEARCH | bin _time span=15m | stats count as TotalRequest by _time ]
| stats sum(NumberOfAuthErrors) AS NumberOfAuthErrors, sum(TotalRequest) AS TotalRequest
| eval failureRate = round((NumberOfAuthErrors / TotalRequest) * 100,3)
| table TotalRequest NumberOfAuthErrors failureRate

 

becksyboy
Communicator

Thank you also for the same solution and the additional context; this is really helpful.



becksyboy
Communicator

Thank you for the help, the missing by clause worked (by _time).
