I am running a search that returns all the failed logins across all servers that occurred in the last 15 minutes. It runs every 15 minutes, and I want it to alert if the number of failed logins is greater than 3 for a SINGLE server. So I don't want my threshold to be 3 overall, but 3 for a specific server. For instance, I don't want an alert if SERVER1, SERVER2, & SERVER3 have each had one failed login, but if SERVER1 has 3 failed logins, then I want an alert generated.
How can I do this when setting up the scheduled search?
Thanks, I am pretty new to Splunk still and am in the process of setting up the various alerts that we need for our infrastructure.
You can add a custom alert condition when you save the scheduled search. Assuming the scheduled search returns all the failed login events, say from the search eventtype=failed_login, the alert condition would look like:
stats count by host | search count >= 3
This condition aggregates the number of failures for each host and filters out the hosts that have fewer than three failures. The alert triggers if the output of the original search piped through the alert condition yields at least one result, which it does whenever any single host has three or more failures.
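Put together, the scheduled search and the alert condition behave like a single pipeline. A minimal sketch, assuming the hypothetical eventtype=failed_login base search and a 15-minute window matching the schedule:

eventtype=failed_login earliest=-15m | stats count by host | search count >= 3

Each row of output names a host that crossed the threshold, so "number of events greater than 0" is all the alert needs to check.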
Hey Kevin, you didn't indicate whether you want the alert to fire only if the three or more failed logins come from the same user.
If you don't care about the user, use this search:
"failed login search" | stats count by host | where count >= 3
And create an alert with an alert condition "number of events" "is greater than" 0.
If you do care on a per-user basis, use this search:
"failed login search" | stats count by username, host | where count >= 3
And create an alert with an alert condition "number of events" "is greater than" 0.
Naturally you will have to replace "failed login search" with your search for failed login events.
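For example, with the hypothetical eventtype=failed_login standing in for "failed login search", the per-user variant would read:

eventtype=failed_login | stats count by username, host | where count >= 3

Here each result row is a (username, host) pair that exceeded the threshold, so a burst of failures spread across several accounts on one server will not trigger the alert.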