Splunk Search

How to get Delta for specified time range

xvxt006
Contributor

Hi,

We are monitoring some counts and would like to get the delta from the last hour to this hour. This job runs every 10 minutes. Say the count is 10 in the previous hour and 15 in the current hour; I would like to see the result as 5 (as a single number).

So I tried the search below, but I am missing something and am not getting the right result. I looked at the delta command, but that only gives the delta between the current event and the previous event.

Can someone help?

... earliest=-1h@h latest=now | stats last(Order) as Count | eval Delta=Count-Count1 | table Delta | append [search sourcetype="xxxxxx" earliest=-2h@h latest=-1h@h | stats last(Order) as Count1]

1 Solution

anssntaco
Path Finder

For starters, you're using Count1 before it's been defined (it only gets defined later in the search). If anything, you'd want to put that eval after the append. But that subsearch might not even be needed. You might want to try something like this:

... earliest=-2h@h | eval hour=strftime(_time,"%H") | stats last(Order) as Count by hour | delta Count as Delta
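
For illustration only, a hedged sketch of what that reordering might look like if the append were kept; note that the two stats rows also have to be merged onto one row before the eval (the field and sourcetype placeholders are copied from the question):

... earliest=-1h@h latest=now
| stats last(Order) as Count
| append [search sourcetype="xxxxxx" earliest=-2h@h latest=-1h@h | stats last(Order) as Count1]
| stats first(Count) as Count first(Count1) as Count1
| eval Delta=Count-Count1
| table Delta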


lguinn2
Legend

Try this

yoursearchhere earliest=-1h@h latest=now 
| bucket _time span=1h
| stats last(Order) as Count by _time
| eval Count = if(_time = relative_time(now(),"-1h@h"),-Count,Count)
| addcoltotals Count

If you want only one result, do this

yoursearchhere earliest=-1h@h latest=now 
| bucket _time span=1h
| stats last(Order) as Count by _time
| eval Count = if(_time = relative_time(now(),"-1h@h"),-Count,Count)
| stats sum(Count) as OverallCount

HTH
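
Since later comments in this thread mention a gauge chart, one hedged extension (not from the thread itself) is to append Splunk's gauge command to the single-value search above so the visualization also gets range bands; the range values 0 10 20 30 are purely illustrative:

... | stats sum(Count) as OverallCount | gauge OverallCount 0 10 20 30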


lguinn2
Legend

Yes, see updated answer!


xvxt006
Contributor

Actually, both solutions gave me an answer, but I could select only one as the accepted answer. I am looking for a single number. Is it possible to get a single number?


xvxt006
Contributor

Hi, thank you. I got the Count as 0, but I am also seeing the count and the negated count. Can I get just the final result? I want to use a gauge chart, so I just need one value.


xvxt006
Contributor

Thank you so much for the detailed explanation. I am planning to use this delta in a gauge chart, so can I just get the final delta value?


anssntaco
Path Finder

If you were to extend your search to more than the last 2 hours, you'd have as many rows as there were hours. And for each hour, the value of Delta would be the difference between Count for that hour and that from the previous hour.
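
As a hedged illustration of that, here is the same pipeline over a longer window (the -6h@h value is just an example); note that %H alone would mis-order or mix hours if the window crossed midnight:

... earliest=-6h@h | eval hour=strftime(_time,"%H") | stats last(Order) as Count by hour | delta Count as Delta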


anssntaco
Path Finder

It's not so much converting the time as it is creating a new field called hour. The reason for doing that is that, using the stats command, you can then calculate last(Order), saved as Count, for each of the previous 2 hours. The delta command then computes the difference between the Count for a given hour and the Count from the previous hour, which it saves as Delta. You end up with a results table that has 3 columns: hour, Count, and Delta. Since you're only looking over the last 2 hours, the table would have 2 rows.
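
If only the final Delta value is wanted, for example for the gauge chart mentioned above, one hedged option (not given in the thread) is to keep just the last row of that table:

... earliest=-2h@h | eval hour=strftime(_time,"%H") | stats last(Order) as Count by hour | delta Count as Delta | tail 1 | table Delta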


xvxt006
Contributor

Hi, thank you for your answer. I am wondering why we even need to convert the time to %H if I have earliest and latest snapped to the hour. Just to be clear: if the current hour is 9 PM, then I would like to get the count from 7 to 8 PM, get the count from 8 to 9 PM, and get the delta. Does your answer provide that?
