Splunk Enterprise

How to create an alert that detects anomalies or outliers in Splunk?

shashank_24
Path Finder

Hi, I have a few alerts that look at the failure rates of my services, with a condition that says: if the failure rate is > 10% AND the number of failed requests is > 200, then trigger the alert.

This is really not the ideal way to do the monitoring. Is there a way in Splunk to use AI to detect anomalies or outliers over time? So basically, if Splunk detects a failure pattern and that pattern stays consistent, don't trigger an alert; only trigger it when the failures go beyond that learned pattern.

Can we do this kind of thing in Splunk using built-in ML or AI?


bowesmana
SplunkTrust

Take a look at the Machine Learning Toolkit (MLTK) - it has some good outlier-detection examples. You can also roll your own; for example, this type of search flags per-minute error counts that fall outside 3 standard deviations of the trailing one-hour average:

search error
| bin _time span=1m
| stats count by _time
| streamstats current=f window=60 avg(count) as avg stdev(count) as stdev
| eval multiplier = 3
| eval lower_bound = avg - (stdev * multiplier)
| eval upper_bound = avg + (stdev * multiplier)
| eval outlier = if(count < lower_bound OR count > upper_bound, 1, 0)
| table _time count lower_bound upper_bound outlier
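To turn a search like this into an alert that fires only on outliers, append a filter and set the alert to trigger when the result count is greater than zero:

| where outlier=1

If you want to try the MLTK route instead, a common pattern is the DensityFunction algorithm: fit a model on historical counts, then apply it on a schedule. A minimal sketch, where the model name error_rate_model and the threshold value are placeholders you would tune for your data - first, fit the model over a representative historical window:

search error
| bin _time span=1m
| stats count by _time
| fit DensityFunction count threshold=0.005 into app:error_rate_model

Then, in the scheduled alert search:

search error earliest=-1h
| bin _time span=1m
| stats count by _time
| apply error_rate_model
| where 'IsOutlier(count)'=1

DensityFunction adds an IsOutlier(count) field set to 1 for points in the low-probability tail defined by threshold. Core Splunk also ships an anomalydetection command if you don't have MLTK installed.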
Results example (screenshot not reproduced here).