I can't speak to your exact 4-hour need, but I can show you how I did it minute by minute. You should be able to combine what dwaddle outlined above with what I have below, and you can tweak it if you don't need to calculate an entire week's worth of data at a time.
Summary Index: runs every minute.
_time User_Count_Per_Minute
8/19/13 12:00:00 AM 5
8/19/13 12:01:00 AM 6
8/19/13 12:02:00 AM 6
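For context, here is a minimal sketch of the kind of one-minute scheduled search that could populate this summary index. The base search (index=your_index sourcetype=your_sourcetype) and the user field are assumptions on my part; substitute whatever actually identifies your users, or enable summary indexing on the scheduled search itself instead of using collect.
index=your_index sourcetype=your_sourcetype earliest=-1m@m latest=@m
| bin _time span=1m
| stats dc(user) AS User_Count_Per_Minute by _time
| collect index=your_summary source="your_summary_index"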
Every night, a scheduled search calculates the previous days' averages and writes them to a CSV file that is used throughout the day. (It actually creates an entire week's worth of data, in case I want to run ad-hoc queries against a day other than the current day.)
Runs @ 12:05AM every night
CSV Search
index=your_summary source="your_summary_index"
earliest=-30d@d latest=@d
| eval equalized_time = strftime(_time,"%A %H:%M")
| stats avg(User_Count_Per_Minute) AS Per_Minute_Average by equalized_time
| table equalized_time Per_Minute_Average
| outputlookup 30_day_user_average.csv
Results (ends up with about 10K rows: 7 days x 1,440 minutes/day = 10,080)
equalized_time Per_Minute_Average
Friday 05:05 30
Friday 05:06 95
....
Friday 23:59 13
Dashboard View - Used to graph the last 3 hours of activity
index=your_summary source="your_summary_index" earliest=-3h@h latest=-1m@m
| eval equalized_time = strftime(_time,"%A %H:%M")
| fields + _time equalized_time User_Count_Per_Minute
| join equalized_time
[
|inputlookup 30_day_user_average.csv
| fields + equalized_time Per_Minute_Average
]
| table _time equalized_time User_Count_Per_Minute Per_Minute_Average
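To graph the comparison rather than table it, one option (a sketch, not necessarily how my dashboard renders it) is to replace the final table line with a timechart:
| timechart span=1m first(User_Count_Per_Minute) AS Current first(Per_Minute_Average) AS Average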
It is also possible to use appendcols, which is how my search originally started before I used a summary index. appendcols only works after a stats aggregation command, which is no longer present here because the aggregation is performed in the summary index.
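For illustration only (this is simplified, not my original search), an appendcols after a stats aggregation looks like this. Because appendcols pastes subsearch columns onto the main results row by row rather than matching on a key, both sides here are reduced to a single row so the alignment is trivial:
index=your_summary source="your_summary_index" earliest=-3h@h latest=-1m@m
| stats avg(User_Count_Per_Minute) AS Last_3h_Average
| appendcols
    [ search index=your_summary source="your_summary_index" earliest=-30d@d latest=@d
    | stats avg(User_Count_Per_Minute) AS Average_30d ]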