Alerting

Set alert severity based on field value condition?

matthewe25
Engager

I have some field value 'foo' and I want to trigger an alert of a different severity depending on its value (e.g., a low-severity alert when foo > 10, a medium-severity alert when foo > 20, and a high-severity alert when foo > 50). I know this is easy to do by creating separate alerts and changing the count condition and severity for each one, but that feels inefficient and will be a pain to go through and edit whenever the conditions change.

I'm new to Splunk, so I'm not aware of any way to set an alert's severity based on a condition. Is this possible to achieve?
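
For reference, the mapping I'm after looks roughly like this in SPL (just a sketch; 'foo' stands in for my actual field):

| eval severity = case(foo > 50, "high", foo > 20, "medium", foo > 10, "low")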

1 Solution

jacobpevans
Motivator

Greetings @matthewe25 ,

I know this isn't exactly what you want, but this might help you: https://community.splunk.com/t5/Alerting/Configure-an-alert-based-on-the-number-of-results-as-warnin...

As for your actual question, I would test whether you can modify the savedsearches.conf file directly and set the severity to a result token (e.g. $result.Criticality$), using numbers instead of words. It's a long shot, and it's definitely not documented, but that's the only way I can think of this working.
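
To be explicit about what I mean, here's an untested sketch (the stanza, index, and field names are made up). Per savedsearches.conf.spec, alert.severity normally takes a literal integer from 1 to 6; whether it will actually expand a $result.*$ token is the undocumented part you'd have to test:

[My dynamic severity alert]
search = index=main sourcetype=my_data \
| stats max(foo) as foo \
| eval Criticality = case(foo > 50, 5, foo > 20, 4, foo > 10, 3) \
| where isnotnull(Criticality)
# Undocumented/untested: a result token here instead of a hard-coded 1-6 value
alert.severity = $result.Criticality$
counttype = number of events
relation = greater than
quantity = 0
enableSched = 1
cron_schedule = */15 * * * *
dispatch.earliest_time = -15m
dispatch.latest_time = now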

As you said in your post, this situation is normally handled by using separate alerts. One way to make them slightly more dynamic is to retrieve the thresholds from a lookup, as in the sketch below. If you do that, at least you won't have to touch each alert to modify the thresholds going forward.
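
For example (lookup and field names are just placeholders), keep the thresholds in a lookup file alert_thresholds.csv:

severity,threshold
low,10
medium,20
high,50

and have each of the three alerts pull its own row, triggering when the number of results is greater than 0:

... your base search ...
| stats max(foo) as foo
| eval severity="high"
| lookup alert_thresholds.csv severity OUTPUT threshold
| where foo > threshold

That way only the CSV needs to change when a threshold does.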

Cheers,
Jacob

If you feel this response answered your question, please do not forget to mark it as such. If it did not, but you do have the answer, feel free to answer your own post and accept that as the answer.
