Alerting

How to send an alert for alert resolution?

giga444
Engager

I want to send an alert when a situation has been corrected. For example, if I set up an alert for low disk space on a host, checking every 15 minutes with a range back of 15 minutes, I will keep getting an alert until I correct the low disk issue on the host. What I want is for Splunk to somehow know, once the problem is corrected, that it needs to send out an "Alert resolved" message. The catch is that I only want that "Alert resolved" message to go out if an alert was previously sent saying there was a problem.

I would think that a flag would somehow need to be set when an alert fires, so that when the condition is corrected, Splunk has a way of knowing it was previously a problem and can send the "Alert resolved" alert.

Anyone know of a way to do this?
Gary

 


JacekF
Path Finder

You haven't provided many details about your data, but in general you can try something like this:

| stats min(disk_usage) as min_usage max(disk_usage) as max_usage 
| where min_usage < threshold and max_usage > threshold

This needs to be run over a time range that covers at least the last two disk usage samples.

I assume that you already have an alert that triggers when disk_usage is above the threshold.
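For reference, a minimal sketch of such a baseline alert search might look like the following. The index, sourcetype, field name, and the 90% threshold are all placeholders, not values from the original thread:

```
index=your_index sourcetype=disk_metrics earliest=-15m
| stats latest(disk_usage) as current_usage by host
| where current_usage > 90
```

Scheduled every 15 minutes, this fires whenever the most recent sample for a host is above the threshold.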


giga444
Engager

It's the alert that goes out after the alert threshold is breached but then recovers for whatever reason. For example, I have an alert that triggers if my free disk space drops below 10%, and a text is sent out. If a host does drop below 10%, Splunk sends an alert that the disk space is below 10%. At that point someone from our team logs on and adds disk space, bringing it back up to 20%, and the alert stops triggering. All of this works fine.

What I want is for something (more of a notification than an alert) to be sent out saying that the disk space issue has been resolved.


JacekF
Path Finder

Then it looks like you need a query with a time range covering the last two samples, comparing the latest value with the earliest. If the latest value is below the threshold and the earliest is above it, that means free space has been restored, and an alert (notification) can be sent.

Or maybe I completely misunderstood your case?
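A minimal sketch of that comparison, assuming 15-minute samples and treating the index, sourcetype, field name, and the 90% threshold as placeholders:

```
index=your_index sourcetype=disk_metrics earliest=-30m
| stats earliest(disk_usage) as previous_usage latest(disk_usage) as current_usage by host
| where previous_usage > 90 AND current_usage <= 90
```

Because this only matches when the earlier sample was above the threshold and the latest one is below it, the "resolved" notification fires once per recovery, which is the asker's requirement that it only go out after a problem alert was sent.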

giga444
Engager

JacekF,
Brilliant! How creative. I hadn't thought of doing it that way.

Thank you

Gary
