Alerting

Set alert severity based on field value condition?

matthewe25
Engager

I have some field value 'foo' and I want to trigger an alert of a different severity depending on its value (e.g., a low-severity alert when foo > 10, a medium-severity alert when foo > 20, and a high-severity alert when foo > 50). I know this is easy to do by creating separate alerts and changing the count condition and severity for each one, but that feels inefficient and will be a pain to go through and edit when the conditions change.

I'm new to Splunk, so I'm not aware of a way to trigger an alert whose severity depends on a condition. Is this possible to achieve?
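For illustration, the mapping I'm after looks roughly like this in a single search (just a sketch; the index and sourcetype are placeholders):

index=my_index sourcetype=my_sourcetype
| stats max(foo) as foo
| eval severity=case(foo > 50, "high", foo > 20, "medium", foo > 10, "low")
| where isnotnull(severity)

What I don't know is how to make the alert itself pick up that severity value rather than maintaining three near-identical alerts.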


jacobpevans
Motivator

Greetings @matthewe25 ,

I know this isn't exactly what you want, but this might help you: https://community.splunk.com/t5/Alerting/Configure-an-alert-based-on-the-number-of-results-as-warnin...

As for your actual question, I would test whether you can edit the savedsearches.conf file directly and set the severity to a result token (e.g., $result.Criticality$), but using numbers instead of words. It's a long shot, and it's definitely not documented, but that's the only way I can think of this working.
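To make that concrete, here is roughly what I'd try in savedsearches.conf. The stanza name, index, and field names below are made up, and the $result.Criticality$ substitution in alert.severity is exactly the undocumented part, so treat this as an experiment rather than a known-good config:

# savedsearches.conf -- sketch only; token substitution in alert.severity
# is NOT documented and may simply be ignored by Splunk.
[Foo threshold alert]
enableSched    = 1
cron_schedule  = */15 * * * *
# Map foo onto the numeric severities savedsearches.conf uses
# (1=debug, 2=info, 3=warn, 4=error, 5=severe, 6=fatal).
search         = index=my_index sourcetype=my_sourcetype \
                 | stats max(foo) as foo \
                 | eval Criticality = case(foo > 50, 5, foo > 20, 4, foo > 10, 3) \
                 | where isnotnull(Criticality)
counttype      = number of events
relation       = greater than
quantity       = 0
# The long shot: a result token instead of a hard-coded number.
alert.severity = $result.Criticality$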

As you said in your post, this situation is normally handled with separate alerts. One way to make them slightly more dynamic is to retrieve the thresholds from a lookup, for example as sketched below. If you do that, at least you won't have to touch each individual alert to modify the thresholds moving forward.
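As a sketch, suppose you create a lookup file called alert_thresholds.csv (a hypothetical name) with one row per severity:

severity,threshold
low,10
medium,20
high,50

Then each alert's search pulls its own threshold from the lookup instead of hard-coding it; the "medium" alert would look something like:

index=my_index sourcetype=my_sourcetype
| stats max(foo) as foo
| appendcols
    [ | inputlookup alert_thresholds.csv
      | where severity="medium"
      | fields threshold ]
| where foo > threshold

When the thresholds change, you only update the lookup file; the alerts themselves never need to be edited.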

Cheers,
Jacob

If you feel this response answered your question, please do not forget to mark it as such. If it did not, but you do have the answer, feel free to answer your own post and accept that as the answer.
