Splunk Search

What command should I use to get the average of my entries by hour?

nelli_
Engager

Hi

I am new to Splunk, so this little operation, which would be simple in SQL, has me really puzzled.

I get a count per distinct entry with ... | stats count as counter by subtype, type, devname, date, date_hour

For example:

subtype     type        devname     date        date_hour   counter
allowed     traffic     FortiWifiA  2016-09-12  10          1
allowed     traffic     FortiWifiA  2016-09-12  11          4
allowed     traffic     FortiWifiA  2016-09-12  12          3
allowed     traffic     FortiWifiB  2016-09-12  13          100
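(In case it helps anyone reproducing this: the elided base search might look something like the sketch below. index=fw and sourcetype=fortigate are placeholders, since the real search is omitted above, and date is derived from _time because Splunk extracts date_hour by default but not a plain date field.)

index=fw sourcetype=fortigate
| eval date=strftime(_time, "%Y-%m-%d")
| stats count as counter by subtype, type, devname, date, date_hour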

If I do ... | stats count as counter by subtype, type, devname, date, date_hour | stats avg(counter) by devname, type, subtype, the result is (each device gets averaged only over the hours in which it actually logged anything):

devname     type        subtype     avg(counter)
FortiWifiA  traffic     allowed     2.667
FortiWifiB  traffic     allowed     100.000

But what I wanted is:

devname     type        subtype     average
FortiWifiA  traffic     allowed     2.000
FortiWifiB  traffic     allowed     25.000

So: the sum of counter grouped by devname, type, subtype, divided by the total number of result rows, which here is the number of distinct hours. For FortiWifiA that is (1 + 4 + 3) / 4 = 2.000, and for FortiWifiB it is 100 / 4 = 25.000. How do I do that?

BR,
Nelli

1 Solution

somesoni2
Revered Legend

Try this:

... | stats count as counter by subtype, type, devname, date, date_hour
| eventstats dc(date_hour) as hours
| eval counter=counter/hours
| stats sum(counter) by devname, type, subtype
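Why this works: eventstats computes dc(date_hour) across the whole result set (4 in the example) and appends it to every row without collapsing them, the eval spreads each hourly count over those hours, and the final stats re-aggregates per device, so FortiWifiA gives (1 + 4 + 3) / 4 = 2.000 and FortiWifiB gives 100 / 4 = 25.000.

If you prefer to divide once per group instead of once per row, an equivalent sketch is below; the output field name avg_per_hour is just illustrative:

... | stats count as counter by subtype, type, devname, date, date_hour
| eventstats dc(date_hour) as hours
| stats sum(counter) as total, max(hours) as hours by devname, type, subtype
| eval avg_per_hour = round(total / hours, 3)

One caveat: date_hour is only the hour of day (0-23), so if your search spans several days you may want to build a combined date+hour key (e.g. eval hour_key=date."-".date_hour) and take dc of that instead.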


nelli_
Engager

Thanks. This got me started on the fine-tuning 🙂
