Splunk Search

How to include hours with no events as 0 in the avg calculation?

Dantuzzo
Loves-to-Learn Lots

Hi,

I'm struggling to calculate an hourly or daily average that treats timespans with no events at all as 0 and includes them in the average. Currently my query only computes the average for a given timespan if there are events for a specific severity; if not, the bucket stays blank and is left out of the avg calculation. Query, current results, and expected results:

 

index=myindex earliest=-7d@d latest=now()
    | bin _time span=1h
    | fields _time, severity
    | stats count by _time, severity
    | eval previous=if(_time<relative_time(now(),"@d"),count,null())
    | eventstats avg(previous) as average by severity
    | eval change_percent=round(((count-average)*100)/count,0)."%"
    | table _time severity count average change_percent

 

_time             severity  count  average  change_percent
2022-12-16 10:00  High      2      2.25     -12%
2022-12-16 12:00  Low       2      2        0%
2022-12-16 14:00  Medium    3      2        33%

 

 

I'd like to show something like this:

_time                   severity  count  average  change_percent
2022-12-16 10:00-11:00  High      2      0.5      x%
2022-12-16 10:00-11:00  Medium    0      1        -x%
2022-12-16 10:00-11:00  Low       0      1        x%
2022-12-16 11:00-12:00  High      0      0.5      x%
2022-12-16 11:00-12:00  Medium    0      1        x%
2022-12-16 11:00-12:00  Low       0      1        x%
2022-12-16 12:00-13:00  High      0      0.5      x%
2022-12-16 12:00-13:00  Medium    0      1        x%
2022-12-16 12:00-13:00  Low       2      1        x%
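For context, the arithmetic gap between the two behaviours can be sketched outside SPL. This is a minimal Python sketch with hypothetical counts (one "High" event pair in a four-hour window, matching the 0.5 average in the expected table above):

```python
# Hourly counts for severity "High" over a 4-hour window (hypothetical
# sample values). Hours with no events produce no row at all in
# `stats count by _time, severity`, so they never reach the average.
observed = {"10:00": 2}                       # the only hour with events
all_hours = ["10:00", "11:00", "12:00", "13:00"]

# Average over observed rows only -- what the original query computes.
avg_observed = sum(observed.values()) / len(observed)     # 2.0

# Average with the missing hours filled in as 0 -- the desired result.
filled = [observed.get(h, 0) for h in all_hours]          # [2, 0, 0, 0]
avg_filled = sum(filled) / len(filled)                    # 0.5
```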

 

Thank you for any help!


yuanliu
SplunkTrust

Also using eventstats, I have a more literal filler method:

index=myindex earliest=-7d@d latest=now()
| bin _time span=1h
| fields _time, severity
| addinfo ``` so we know how many hours are in search period ```
| eval filler = mvrange(0, round((info_max_time - info_min_time) / 3600))
| stats count values(filler) as filler by _time, severity
| appendpipe
    [ eval count = 0
    | mvexpand filler ``` account for every hour ```
    | eval _time = relative_time(now(), "-" . tostring(filler) . "h@h")
    | bin span=1h _time]
| stats sum(count) as count by _time, severity ``` count + 0 == count ```
| eval previous=if(_time<relative_time(now(),"@d"),count,null())
| eventstats avg(previous) as average by severity
| eval change_percent=round(((count-average)*100)/count,0)."%"
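Outside SPL, the filler trick amounts to generating a count=0 row for every (hour, severity) pair and summing it into the real rows. A minimal Python sketch, with hypothetical hour indices and counts standing in for the _time buckets:

```python
hours = range(4)                          # 4 hourly buckets in the search period
severities = ["High", "Medium", "Low"]

# Real rows from `stats count by _time, severity` (hypothetical sample).
real = {(0, "High"): 2, (2, "Low"): 2}

# Filler: one count=0 row per (hour, severity) pair, mirroring the
# mvrange + mvexpand step inside the appendpipe subsearch.
filler = {(h, s): 0 for h in hours for s in severities}

# `stats sum(count) by _time, severity`: count + 0 == count, but every
# bucket now exists, so empty hours enter the average as 0.
combined = {key: filler[key] + real.get(key, 0) for key in filler}

high_counts = [combined[(h, "High")] for h in hours]      # [2, 0, 0, 0]
```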

richgalloway
SplunkTrust

Have you met appendpipe?  The command defines a subpipeline that runs against the current results and is often used to insert values when results are missing.  In this case, we use three calls to appendpipe because we have three values to test for and insert if missing.

index=_internal earliest=-7d@d latest=now()
| bin _time span=1h 
| fields _time, log_level 
| stats count by _time, log_level 
| appendpipe 
    [ stats values(log_level) as levels by _time 
    | eval log_level=if(isnull(mvfind(levels,"ERROR")),"ERROR", null()),count=0 
    | where isnull(mvfind(levels,"ERROR")) 
    | fields - levels ] 
| appendpipe 
    [ stats values(log_level) as levels by _time 
    | eval log_level=if(isnull(mvfind(levels,"WARN")),"WARN", null()),count=0 
    | where isnull(mvfind(levels,"WARN")) 
    | fields - levels ] 
| appendpipe 
    [ stats values(log_level) as levels by _time 
    | eval log_level=if(isnull(mvfind(levels,"INFO")),"INFO", null()),count=0 
    | where isnull(mvfind(levels,"INFO")) 
    | fields - levels ] 
| sort + _time 
| eval previous=if(_time<relative_time(now(),"@d"),count,null()) 
| eventstats avg(previous) as average by log_level 
| eval change_percent=round(((count-average)*100)/count,0)."%" 
| table _time log_level count average change_percent
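The effect of the three subsearches can be sketched as: for each _time bucket already present, insert a count=0 row for any level that has no row. A Python sketch with hypothetical sample rows (note that, like the appendpipe version, it only fills hours that already have at least one row):

```python
levels = ["ERROR", "WARN", "INFO"]

# Rows as they come out of `stats count by _time, log_level`
# (hypothetical sample data).
rows = [("10:00", "ERROR", 5), ("10:00", "WARN", 1), ("11:00", "INFO", 3)]

# Group the existing rows by hour.
by_hour = {}
for t, level, count in rows:
    by_hour.setdefault(t, {})[level] = count

# Insert count=0 for every level missing from an hour that exists --
# the same test the mvfind/where pair performs per appendpipe call.
filled = sorted(
    (t, level, present.get(level, 0))
    for t, present in by_hour.items()
    for level in levels
)
```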

 

---
If this reply helps you, Karma would be appreciated.