Splunk Search

Can I use relative time for bin span?

mpatel11
Explorer

I want to run a query with a rolling time span (rolling every minute) that counts events in the last 1 hour relative to the current minute.

I am trying to run this query:

Search query | bin _time span=(now(), "-1h") | stats range(_time) AS Range, latest(_time) AS Latest count BY A, B, C, date_hour

but of course span does not accept negative values.

Example:
10:04 - xxxxx
10:06 - xxxxx
10:09 - xxxxx
10:16 - xxxxx
11:07 - xxxxx
11:14 - xxxxx

So if my current time is 11:08 and I count events for the last 1 hour from now, it should count in the time range 10:08 - 11:08, giving a count of 3. If I simply use bin _time span=1h, it returns a count of 4 for the 10th hour and 2 for the 11th hour.

So basically I want my time span to roll each minute and then get a count for exactly the last 60 minutes (1 hour).

1 Solution

acharlieh
Influencer

What if you calculated an offset relative to the current minute, adjusted all _time values by that offset prior to bin, and then, after bin makes your 60-minute-wide buckets, restored the offset? Here's a mock to demonstrate:

| makeresults count=36000 | streamstats count | eval _time=_time-count, thetime=_time 
| eval offset=relative_time(now(),"@m")-relative_time(now(),"@h"), _time=_time-offset 
| bin _time span=1h 
| eval _time=_time+offset 
| stats min(thetime) max(thetime) count by _time | convert ctime(*thetime*)

Alternatively, because your bins are an hour wide, you might be able to do this in one eval statement, like so:

base search 
| eval offset=relative_time(now(),"@m")-relative_time(now(),"@h"), _time=relative_time(_time-offset,"@h")+offset
| stats X by _time

I should also note that date_hour is the hour parsed from the timestamp of the log, whereas _time is the UTC epoch time, which is then displayed in the local time zone. Calculations on _time can sometimes get funny in time zones that are not a whole number of hours offset from UTC.
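The shift / bin / unshift trick above is plain epoch-seconds arithmetic, so (outside Splunk) it can be sketched in Python. This is just an illustration of the math, not SPL; shifted_hour_bin is a hypothetical helper name, and all times are Unix epoch seconds:

```python
# Sketch of the offset trick in plain epoch arithmetic (not SPL).

HOUR, MINUTE = 3600, 60

def shifted_hour_bin(t, now):
    # offset = relative_time(now(),"@m") - relative_time(now(),"@h"):
    # the seconds between the current minute boundary and the current
    # hour boundary (e.g. 8 minutes past the hour -> 480 seconds)
    offset = (now // MINUTE) * MINUTE - (now // HOUR) * HOUR
    # shift down by the offset, snap to the hour (like bin _time span=1h),
    # then shift back up, giving hour-wide buckets aligned to the
    # current minute (here, :08) instead of the top of the hour
    return ((t - offset) // HOUR) * HOUR + offset
```

With now at 11:08, the example events at 10:04, 10:06, 10:09, 10:16 and 11:07 fall into buckets starting at 09:08 (two events) and 10:08 (three events), which is exactly the rolling count the question asks for.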

mpatel11
Explorer

For the first query above I removed makeresults and streamstats and re-did the evaluation on a temporary time field instead of _time, as below, and it gave me the same results.

base search
| eval t=_time
| eval offset=relative_time(now(),"@m")-relative_time(now(),"@h"), t=t-offset
| bin t span=1h
| eval t=t+offset
| stats range(_time) AS Range, latest(_time) AS Latest count BY t, ErrCode
| eval LastEvent=strftime(Latest, "%Y-%m-%d %H:%M:%S"), t=strftime(t, "%Y-%m-%d %H:%M:%S")

Is this the correct way? As per the results, it's the same as using the initial query with makeresults and streamstats.

Moreover, I'm curious about "eval _time=_time-count": what does it do, calculation-wise? Since _time would be in time format and count would be a number.


acharlieh
Influencer

_time is only specially formatted for display; it is actually just the number of seconds since Jan 1, 1970 (Unix epoch time), as you can see when you do something like eval foo=_time.

makeresults, streamstats, and that first eval are me setting up data for a run-anywhere example, namely artificially creating one result per second for the last 10 hours (60 seconds per minute * 60 minutes per hour * 10 hours = a count of 36000 events). streamstats count assigns an ordinal (count) to each result, so I can subsequently subtract that number of seconds from the nth result. You are correct that it would be replaced with your base search (though you may still need the field copy of _time from that first eval).
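Since _time is just epoch seconds, what makeresults count=36000 | streamstats count | eval _time=_time-count builds can be mimicked outside Splunk; a Python sketch of the equivalent data setup:

```python
import time

# one artificial result per second for the last 10 hours:
# streamstats assigns count = 1..36000, and eval _time=_time-count
# subtracts that many seconds from now(), as in the SPL mock
now = int(time.time())
times = [now - count for count in range(1, 36001)]
```

The resulting timestamps run from one second ago back to 36000 seconds (ten hours) ago, one second apart.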


mpatel11
Explorer

Hey, I tried the second query and it works perfectly. Thank you.
I still don't clearly understand how it works. Can you help me understand, or provide a link to some documentation?
As you mentioned, this will only work when my time span is one hour; if I need to change it to x minutes, I believe a different evaluation is needed.

One other thing I noticed is that this does the stats count by changing the event time itself. So if the current time is 09:22 and my two events occurred at 09:01 and 08:40, the query returns the correct count of 2 events occurring between 08:22 and 09:22, but it also changes the _time field values: it now shows that both events occurred at 08:22, exactly 1 hour before now.

In other words, I would like to get the stats count over a time span relative to now, but keep the original event times as they are. So the output should give a stats count of 2, but latest(_time) should give me 09:01.

I did this by saving _time in a temporary field, say Time (eval Time=_time), and performing the time operations on the field Time.
latest(Time) now gives me the correct value for when the latest event occurred in that interval. Is this the correct way to do it, or is there a better way?
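Grouping by the shifted bucket while keeping the original timestamps, as done here with the Time copy, can also be sketched in plain epoch arithmetic. This is an illustration in Python rather than SPL; rolling_hour_stats is a hypothetical name, and times are Unix epoch seconds:

```python
HOUR, MINUTE = 3600, 60

def rolling_hour_stats(times, now):
    # same offset as in the SPL: seconds from the hour boundary to
    # the minute boundary of "now"
    offset = (now // MINUTE) * MINUTE - (now // HOUR) * HOUR
    stats = {}  # bucket start -> (count, latest ORIGINAL event time)
    for t in times:
        # bin a derived value, not the event time itself, mirroring
        # eval Time=_time followed by bin Time span=1h
        bucket = ((t - offset) // HOUR) * HOUR + offset
        count, latest = stats.get(bucket, (0, t))
        stats[bucket] = (count + 1, max(latest, t))
    return stats
```

With now at 09:22 and events at 08:40 and 09:01 (the example above), both land in the bucket starting at 08:22, with a count of 2 and latest equal to the original 09:01 timestamp.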


acharlieh
Influencer

bin and eval are indeed manipulating the _time field for the latter steps of the search. If you need to preserve the original values as well, the best way is to eval into a different field, either copying the original or doing your calculations in a new field.

Docs on eval functions: http://docs.splunk.com/Documentation/Splunk/7.0.0/SearchReference/CommonEvalFunctions which links to more details on the different classes of functions used with eval-like SPL commands

Docs for the bin (aka bucket) command: http://docs.splunk.com/Documentation/Splunk/7.0.0/SearchReference/bin


mpatel11
Explorer

Thanks for the reply. I have not used time modifiers before, so there will be a bit of a learning curve here. I'll definitely try this out.


richgalloway
SplunkTrust

Have you tried a 60-minute span? ... | bin _time span=60m | ...


mpatel11
Explorer

But now I have another issue with this: the moment I use the bin command, my Range values in the stats go blank.
So when I run Search query | bin _time span=(now(), "-1h") | stats range(_time) AS Range, latest(_time) AS Latest count BY _time, A, B, C (basically adding time-based columns to the stats command), the Range value goes to 0 when I use the bin command.


mpatel11
Explorer

Hey, thanks for the reply. Yes, I did, and for some reason it treats 60m the same as 1h; I can see the bucketing being done hour to hour. When I change it to 59m, the bucketing changes to exactly the last 59 minutes.
