Splunk Search

Need to change the span based on hour of day

wjrbrady
Loves-to-Learn

Hello,

I am trying to change the span in timechart from within the search itself. So if the hour is, say, greater than 7 and less than 19, make the span=10m, else 1h.

example

| eval hour=strftime(_time,"%H")
| eval span=if(hour>=7 AND hour<19,"10m","1h")
| timechart span=span count(field1), count(field2) by field3


yuanliu
SplunkTrust

@bowesmana and @PrewinThomas give you two different approaches.  I will put a different spin on Prewin27's append method. (BTW, there should be no need to sort by _time after timechart.)  To avoid searching the same data multiple times, I use map.

In the following example, I simplify the interval split by restricting the total search window to earliest=-1d@d latest=-0d@d.

| tstats count where index=_internal earliest=-1d@d latest=-0d@d
| addinfo ``` just to extract boundaries ```
| eval point1 = relative_time(info_min_time, "+7h"), point2 = relative_time(info_min_time, "+19h")
| eval interval = mvappend(json_object("earliest", info_min_time, "latest", point1),
  json_object("earliest", point1, "latest", point2),
  json_object("earliest", point2, "latest", info_max_time))
| mvexpand interval
| spath input=interval
| eval span = if(earliest == point1, "10m", "1h")
``` the above uses prior knowledge about point1 and point2 ```
| map search="search index=_internal earliest=$earliest$ latest=$latest$
  | timechart span=$span$ count"

[Screenshot of the resulting timechart omitted]

Obviously if your search window is not one 24-hour period, interval split becomes more complex.  But the same logic can apply to any window.
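For instance, a multi-day window could be handled by generating the per-day boundaries first. Here is an untested sketch along the same lines (again using index=_internal as stand-in data; note mvrange steps in fixed 86400-second increments, so it ignores DST changes, and map needs maxsearches raised for longer windows):

```
| tstats count where index=_internal earliest=-3d@d latest=@d
| addinfo
| eval day_start = mvrange(info_min_time, info_max_time, 86400)
| mvexpand day_start
| eval p1 = relative_time(day_start, "+7h"), p2 = relative_time(day_start, "+19h"), day_end = relative_time(day_start, "+24h")
| eval interval = mvappend(
    json_object("earliest", day_start, "latest", p1, "span", "1h"),
    json_object("earliest", p1, "latest", p2, "span", "10m"),
    json_object("earliest", p2, "latest", day_end, "span", "1h"))
| mvexpand interval
| spath input=interval
| map maxsearches=30 search="search index=_internal earliest=$earliest$ latest=$latest$ | timechart span=$span$ count"
```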


bowesmana
SplunkTrust

You can technically achieve this through post-processing of the timechart data. All you do is create your timechart with the smaller span, then add up the 6 × 10-minute blocks outside your daytime range and remove the unnecessary ones.

Here's an example using streamstats/eventstats - there are probably other ways, but this works 

index=_audit
| timechart span=10m count
| eval t=strftime(_time, "%H")
| streamstats window=6 sum(eval(if(t>=7 AND t<19, null(), count))) as hourly by t
| eventstats max(hourly) as hourly_max min(hourly) as hourly_min by t
| where hourly=hourly_min OR isnull(hourly)
| eval hourly=hourly_max
| fields - hourly* t

You could make it simpler depending on your total search time range.

You will see the X axis will not change, but you will only have hourly data points in the 19-07 hours.


PickleRick
SplunkTrust

Be aware though that not all aggregation functions are further aggregatable.

For example, sum or max/min can be aggregated from smaller spans into a correct overall value, but avg cannot.
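To illustrate with synthetic data (a makeresults sketch, so the numbers are made up): group a holds values 10, 20, 30 (avg 20 over 3 events) and group b holds 100 (avg 100 over 1 event), so the average of the per-group averages is 60 while the true overall average is 40.

```
| makeresults count=4
| streamstats count as n
| eval value = case(n==1, 10, n==2, 20, n==3, 30, n==4, 100)
| eval grp = if(n<=3, "a", "b")
| stats avg(value) as grp_avg, count as grp_count by grp
| stats avg(grp_avg) as avg_of_avgs, sum(eval(grp_avg*grp_count)) as total, sum(grp_count) as n
| eval true_avg = total / n
```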


bowesmana
SplunkTrust

Getting a bit nerdy here, but @PickleRick, if you know in advance what you want to do and can figure out the maths, then you can handle others too, e.g. the post-aggregation of an average is simply sum/count:

index=_audit
| eval r=random() % 100
| timechart span=10m avg(r) as avg_r sum(r) as s_r count
| eval h=strftime(_time, "%H"), d=strftime(_time, "%d"), m=strftime(_time, "%M")
| eventstats sum(count) as count_1_hour sum(s_r) as sum_r_1_hour by d h
| where (h>=7 AND h<19) OR m==0
| eval avg_r = if(h<7 OR h>=19, sum_r_1_hour / count_1_hour, avg_r)
| fields - d h m sum_r_1_hour count_1_hour s_r

Percentiles, on the other hand, are a little more complicated. I suspect the sitimechart command will do a lot of the work for the first pass, and then it's a bit of post-processing of the psrsvd_rd* variables. I'm not totally sure how the si_* values are aggregated for percentiles; I played around with it some years ago and got lost in the weeds, but it was a somewhat interesting exercise.


PickleRick
SplunkTrust

Of course. I know it, you know it... But people tend to forget it. Way too many times I've seen average speed or average fuel consumption calculated by averaging multiple averages.


PrewinThomas
Builder

@wjrbrady 

The Splunk timechart command's span argument must be a fixed value per search execution; you cannot dynamically change the span within a single timechart based on the hour of the day.

However, you can achieve similar logic using a combination of eval, bin, and append.

E.g., using append:
(
search ... earliest=@d latest=now
| eval hour=strftime(_time,"%H")
| where hour > 7 AND hour < 19
| timechart span=10m sum(count) as count
)
| append
(
search ... earliest=@d latest=now
| eval hour=strftime(_time,"%H")
| where hour <= 7 OR hour >= 19
| timechart span=1h sum(count) as count
)
| sort _time

Also, if you want a single timeline with custom bucket sizes, you can create your own time buckets using eval and bin.
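One way to sketch that idea (assuming index=_internal as stand-in data; since bin itself only accepts a fixed span, the variable-width bucket start is computed with eval and modulo arithmetic instead):

```
index=_internal
| eval hour = tonumber(strftime(_time, "%H"))
| eval width = if(hour>=7 AND hour<19, 600, 3600)
| eval _time = _time - (_time % width)
| stats count by _time
```

This yields 10-minute buckets between 07:00 and 19:00 and hourly buckets overnight on a single timeline, though the chart's X axis will show unevenly spaced points.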


Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a kudos. Thanks!


wjrbrady
Loves-to-Learn

Thank you for the example, I will take a look. Also, I tried the eval/bin approach, but it would not let me use an if or case statement to set the bin size. Do you have an example?


PickleRick
SplunkTrust

How would the results of such a search look? Do you want to change the span for the whole search, or have multiple spans within one search? (What sense would that make?)


wjrbrady
Loves-to-Learn

Hello picklerick,

I was trying to compare today to last week, but based on the volume of data overnight I wanted the data in one-hour buckets, and during the day I wanted 10-minute buckets. This would be for an alert, where you can't use time-based tokens to set the span. So it would be for the whole search.


wjrbrady
Loves-to-Learn

Thanks for the update.


gcusello
SplunkTrust

Hi @wjrbrady ,

I'm sorry, but it isn't possible to dynamically change the span value in a timechart command.

You have to define a fixed value.

Ciao.

Giuseppe
