Splunk Search

How to search the average number of transactions by hour by day?

RVDowning
Contributor

Am doing the following trying to get the average number of transactions by hour by day:

| bucket _time span=1h 
| stats avg(count) AS "Average Transaction Count" by date_hour, date_wday

By leaving out "avg", I can get the total count of transactions by hour by day, but I would just like to get the average of them.

1 Solution

sideview
SplunkTrust
SplunkTrust

OK I understand.

You don't want to use bucket here, by the way. If for some reason there are hours with zero events, bucket will simply skip those hours, so those zeros won't affect your average at all (and you need them to). Instead you want timechart. Timechart will create a full set of buckets for all the hours, even hours that had no events in them.

And using date_hour and date_wday would be tricky. Fortunately we don't need them. We can easily make our own with the strftime function in eval, just using the time values in the timechart output.

So run this search over the last 30 days or something.

| timechart span=1h count 
| eval weekday=strftime(_time,"%A") 
| eval hour=strftime(_time,"%H") 
| stats avg(count) by hour weekday

timechart will get the hourly counts for every hour in the timeperiod. Then eval will make two little fields on each row, "weekday" and "hour", with the appropriate values. Then finally stats just calculates the average count for each combination of hour and day.
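If the weekday values come out in alphabetical rather than calendar order, one variant (a sketch; it relies on the %u time format variable, which returns the ISO day number 1-7) is to carry a sortable day number through the stats and drop it at the end:

| timechart span=1h count 
| eval weekday=strftime(_time,"%A") 
| eval daynum=strftime(_time,"%u") 
| eval hour=strftime(_time,"%H") 
| stats avg(count) AS avg_count by daynum weekday hour 
| sort daynum hour 
| fields - daynum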


RVDowning
Contributor

Ok, did the following:

| timechart span=1h count
| eval weekday=strftime(_time,"%A")
| eval hour=strftime(_time,"%H")
| chart avg(count) by weekday, hour

I prefer the output that chart provided, and reversed the hour and weekday. Thanks for your help!


aljohnson_splun
Splunk Employee
Splunk Employee

Have you seen the timechart command's function per_hour() ?

... | timechart per_hour(field) span=1d

RVDowning
Contributor

For the time period selected, I would like the average number of transactions per hour per day of the week. So I would be able to see what the average number of transactions was for Wednesdays at 2:00 PM, or Fridays at 5:00 PM, etc.


gyslainlatsa
Motivator

Hi,
try this:

 | bucket _time span=1h 
 | stats count AS Transaction_Count by date_hour, date_wday
 | stats avg(Transaction_Count) AS "Average Transaction Count" 
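For reference, a variant of this approach that keeps the hour/weekday split would be to count per hourly bucket first, then average those counts across buckets (a sketch; note that sideview's caveat still applies, since bucket produces no row at all for an hour with zero events):

 | bucket _time span=1h 
 | stats count AS Transaction_Count by _time, date_hour, date_wday 
 | stats avg(Transaction_Count) AS "Average Transaction Count" by date_hour, date_wday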

RVDowning
Contributor

This just gives me one number, not a number by hour by weekday.


sideview
SplunkTrust
SplunkTrust

Can you clarify what you mean by the "average of them"? It doesn't quite make sense on its own, so I think you're missing some context. Do you want to end up with a single number representing the average count per hour? Or one row per day, listing the hourly averages for each day?
