Alerting

Set alert severity based on field value condition?

matthewe25
Engager

I have a field 'foo' and I want to trigger an alert at a different severity depending on its value (e.g., a low-severity alert when foo > 10, a medium-severity alert when foo > 20, and a high-severity alert when foo > 50). I know this is easy to do by creating separate alerts and changing the count condition and severity for each one, but that feels inefficient and will be a pain to go through and edit whenever the thresholds change.

I'm new to Splunk so I'm not aware of a way to trigger an alert with a condition for its severity. Is this possible to achieve?

1 Solution

jacobpevans
Motivator

Greetings @matthewe25 ,

I know this isn't exactly what you want, but this might help you: https://community.splunk.com/t5/Alerting/Configure-an-alert-based-on-the-number-of-results-as-warnin...

As for your actual question, I would test whether you can edit the savedsearches.conf file directly and set the severity to a result token (e.g. $result.Criticality$), trying numbers as well as words. It's a long shot, and it's definitely not documented, but that's the only way I can think of this working.
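As a sketch of what that experiment might look like (the stanza name, index, and field names are all made up, and whether alert.severity expands a result token at all is undocumented; severity is normally a static integer from 1=info to 6=severe):

```ini
[Dynamic severity alert]
search = index=main sourcetype=my_data \
| stats max(foo) as foo \
| eval Criticality = case(foo > 50, 5, foo > 20, 4, foo > 10, 3) \
| where isnotnull(Criticality)
# Untested: it is not documented that $result.Criticality$ is expanded
# here. If it is not, fall back to a static value between 1 and 6.
alert.severity = $result.Criticality$
```

If the token is not expanded, the alert will likely fail to save or default its severity, so check the result in Settings > Searches, reports, and alerts after a test run.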

As you said in your post, this situation is normally handled with separate alerts. One way to make them slightly more dynamic is to retrieve the thresholds from a lookup. That way, at least you won't have to touch each of the alerts when the thresholds change going forward.
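For example (assuming a hypothetical lookup file foo_thresholds.csv with columns severity and threshold, containing rows like low,10 / medium,20 / high,50), each alert's search could pull its own threshold, so the alert fires when any results remain:

```
index=main sourcetype=my_data
| stats max(foo) as foo
| appendcols
    [| inputlookup foo_thresholds where severity="medium"
     | fields threshold ]
| where foo > threshold
```

The three alerts then differ only in the severity value passed to the inputlookup, and all thresholds live in one editable lookup file.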

Cheers,
Jacob

If you feel this response answered your question, please do not forget to mark it as such. If it did not, but you do have the answer, feel free to answer your own post and accept that as the answer.
