
Set alert severity based on field value condition?

matthewe25
Engager

I have a field 'foo' and I want to trigger an alert of a different severity depending on its value (e.g., a low-severity alert when foo > 10, a medium-severity alert when foo > 20, and a high-severity alert when foo > 50). I know this is easy to do by creating separate alerts and changing the count condition and severity for each one, but that feels inefficient and will be a pain to go through and edit whenever the conditions change.

I'm new to Splunk, so I'm not aware of a way to set an alert's severity based on a condition. Is this possible to achieve?

1 Solution

jacobpevans
Motivator

Greetings @matthewe25 ,

I know this isn't exactly what you want, but this might help you: https://community.splunk.com/t5/Alerting/Configure-an-alert-based-on-the-number-of-results-as-warnin...

As for your actual question, I would test whether you can modify the savedsearches.conf file directly and set the alert severity to a result token (e.g. $result.Criticality$), using numbers instead of words. It's a long shot, and it's definitely not documented, but that's the only way I can think of this working.
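
For illustration only, a rough sketch of what that experiment might look like in savedsearches.conf. The stanza name, base search, and the Criticality field are placeholders, and token substitution in alert.severity is undocumented, so it may simply be ignored:

[Foo severity alert]
enableSched = 1
cron_schedule = */15 * * * *
search = index=main sourcetype=my_data \
| stats max(foo) as foo \
| where foo > 10 \
| eval Criticality = case(foo > 50, 5, foo > 20, 4, true(), 3)
# alert.severity normally takes a static integer (1=debug ... 6=fatal);
# the experiment is whether a result token gets substituted here
alert.severity = $result.Criticality$
# trigger whenever the search returns at least one result
counttype = number of events
relation = greater than
quantity = 0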

As you said in your post, this situation is normally handled with separate alerts. One way to make them slightly more dynamic is to retrieve the thresholds from a lookup; that way you at least won't have to touch each alert to modify the thresholds going forward.
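
As a hypothetical illustration of the lookup approach (foo_thresholds.csv, its severity and threshold columns, and the base search are made-up names, and the CSV would need to be uploaded as a lookup table file or exposed through a lookup definition), each of the separate alerts could look something like this, differing only in the hard-coded severity label:

index=main sourcetype=my_data
| stats max(foo) as foo
| eval severity="high"
| lookup foo_thresholds.csv severity OUTPUT threshold
| where foo > threshold

The "high" alert uses severity="high", the medium one "medium", and so on, and each triggers on "number of results is greater than 0". Changing a threshold then only means editing one row of foo_thresholds.csv instead of opening every alert.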

Cheers,
Jacob

If you feel this response answered your question, please do not forget to mark it as such. If it did not, but you do have the answer, feel free to answer your own post and accept that as the answer.
