Create alert when average events greater than 2 standard deviations from rolling average
I know there are several threads on Splunk Answers that reference alerts based on standard deviation. I have tried a few of them, but their use cases do not seem to match what I need.
I would like to create an alert that fires when the average number of events over the last 5 minutes is more than 2 standard deviations above the rolling average of events over the last 60 minutes. This post is the closest I have found, but I am still stuck.
Any assistance would be appreciated.
Thanks

Hi Jodros, check out the Security Essentials app, which goes through a lot of statistical use cases like this:
Security Essentials App:
https://splunkbase.splunk.com/app/3435/
For example, you can start from the "Sources Sending a High Volume of DNS Traffic" use case in the Security Essentials app. This query identifies hosts with very high traffic (more than 3 standard deviations above the average), and you should be able to adapt it to your use case; a sketch adapted to your 5-minute/60-minute scenario follows the example:
| inputlookup dns_data_anon.csv
| convert mktime(_time) timeformat="%Y-%m-%dT%H:%M:%S.%3Q%z"
| bucket _time span=1h
| stats sum(bytes*) as bytes* by src_ip _time
| eventstats max(_time) as maxtime avg(bytes_out) as avg_bytes_out stdev(bytes_out) as stdev_bytes_out
| eventstats count as num_data_samples avg(eval(if(_time < relative_time(maxtime, "@h"),bytes_out,null))) as per_source_avg_bytes_out stdev(eval(if(_time < relative_time(maxtime, "@h"),bytes_out,null))) as per_source_stdev_bytes_out by src_ip
| where num_data_samples >=4 AND bytes_out > avg_bytes_out + 3 * stdev_bytes_out AND bytes_out > per_source_avg_bytes_out + 3 * per_source_stdev_bytes_out AND _time >= relative_time(maxtime, "@h")
| eval num_standard_deviations_away_from_org_average = round(abs(bytes_out - avg_bytes_out) / stdev_bytes_out,2), num_standard_deviations_away_from_per_source_average = round(abs(bytes_out - per_source_avg_bytes_out) / per_source_stdev_bytes_out,2)
| fields - maxtime per_source* avg* stdev*
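
To adapt this to your 5-minute vs. 60-minute requirement, here is a rough, untested sketch (index=your_index is a placeholder for whatever base search returns your events). It buckets events into 5-minute counts and compares each bucket against a rolling baseline built from the previous twelve 5-minute buckets, i.e. the prior 60 minutes:
index=your_index
| bucket _time span=5m
| stats count as events by _time
| streamstats window=12 current=false avg(events) as rolling_avg stdev(events) as rolling_stdev
| eval upper_bound = rolling_avg + (2 * rolling_stdev)
| where events > upper_bound
Here window=12 covers the previous twelve 5-minute buckets (60 minutes) and current=false keeps the bucket being tested out of its own baseline. If this does what you want, you could save it as an alert scheduled every 5 minutes over roughly the last 65 minutes and trigger when the number of results is greater than 0, adjusting the multiplier (2 vs. 3 standard deviations) to control sensitivity.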
