Alerting

What is the easiest way to send an alert when another alert's trigger condition has cleared?

Splunk Employee

What would be the easiest way to send an alert when another alert's trigger condition has cleared? Say the original alert was triggered when the count of events equaled zero over a five-minute period. How would you send an alert (update) when the count went greater than zero AFTER the initial alert triggered?

1 Solution

Splunk Employee

Basically, you can count the events that have happened since the last alert was triggered. If the count is greater than zero, you're in business! Here's the search:

$search_string_to_isolate_your_dataset$ latest=now [search index="_internal" sourcetype="scheduler" thread_id="AlertNotifier*" savedsearch_name=$your_alert$ | stats first(_time) as earliest | eval earliest = strftime(earliest, "%m/%d/%Y:%H:%M:%S")] | stats count | where count > 0

Replace $search_string_to_isolate_your_dataset$ with the actual search you use to check whether events were indexed, and replace $your_alert$ with the name of the alert that originally fired. The subsearch looks in the _internal scheduler logs for the most recent time that alert fired and returns that timestamp as the earliest time bound for the outer search, so the outer search only counts events indexed after the original alert triggered.
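For example, assuming a hypothetical dataset in index=web with sourcetype=access_combined and an original alert saved as "No Web Traffic" (both names are placeholders, substitute your own), the filled-in search would look like this:

index=web sourcetype=access_combined latest=now [search index="_internal" sourcetype="scheduler" thread_id="AlertNotifier*" savedsearch_name="No Web Traffic" | stats first(_time) as earliest | eval earliest = strftime(earliest, "%m/%d/%Y:%H:%M:%S")] | stats count | where count > 0

Save this as a second alert, schedule it on the same interval as the original (for example, every five minutes), and set its trigger condition to fire when the number of results is greater than zero; it will then send the all-clear once events start arriving again.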

Hope this helps.

