Alerting

Set alert severity based on field value condition?

matthewe25
Engager

I have a field 'foo' and I want to trigger an alert of a different severity depending on its value (e.g. a low-severity alert when foo > 10, a medium-severity alert when foo > 20, and a high-severity alert when foo > 50). I know this is easy to do by creating separate alerts and changing the trigger condition and severity for each one, but that feels inefficient and will be a pain to go through and edit whenever the conditions change.

I'm new to Splunk, so I'm not aware of a way to set an alert's severity based on a condition. Is this possible to achieve?
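
For context, each of my current alerts is essentially the same search with only the threshold changed, roughly like this (index and field names simplified):

index=app_logs sourcetype=app_metrics
| stats max(foo) as foo
| where foo > 20

with the trigger condition set to "number of results greater than 0" and the severity set by hand (Low, Medium, or High) on each copy.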

1 Solution

jacobpevans
Motivator

Greetings @matthewe25 ,

I know this isn't exactly what you want, but this might help you: https://community.splunk.com/t5/Alerting/Configure-an-alert-based-on-the-number-of-results-as-warnin...

As for your actual question, I would test whether you can modify the savedsearches.conf file directly and set the severity to a result token (e.g. $result.Criticality$), but using numbers instead of words. It's a long shot, and it's definitely not documented, but that's the only way I can think of this working.
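
An untested sketch of what I mean in savedsearches.conf (the stanza name, search, and Criticality field are made up; alert.severity officially takes a static integer from 1=debug through 6=fatal, so whether it accepts a $result.*$ token at all is exactly the part you would need to verify):

[Dynamic severity alert]
# hypothetical search - compute a numeric severity from foo
search = index=app_logs sourcetype=app_metrics | stats max(foo) as foo | eval Criticality=case(foo>50, 5, foo>20, 4, foo>10, 3, true(), 2)
counttype = number of events
relation = greater than
quantity = 0
# the long shot: pass the computed number through as the severity
alert.severity = $result.Criticality$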

As you said in your post, this situation is normally handled by using separate alerts. One way to make them slightly more dynamic is to retrieve the thresholds from a lookup. If you do that, at least you won't have to touch the multiple alerts to modify the thresholds moving forward.
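
A rough sketch of the lookup approach (lookup name, field names, and thresholds are made up, and it assumes alert_thresholds.csv has been uploaded as a lookup table file). The lookup holds one row per alert:

alert_name,threshold
foo_low,10
foo_medium,20
foo_high,50

and each of the three alerts runs the same search, only differing in which row it references:

index=app_logs sourcetype=app_metrics
| stats max(foo) as foo
| eval alert_name="foo_medium"
| lookup alert_thresholds.csv alert_name OUTPUT threshold
| where foo > threshold

Changing a threshold later is then a one-line edit to the CSV rather than reconfiguring each alert.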

Cheers,
Jacob

If you feel this response answered your question, please do not forget to mark it as such. If it did not, but you do have the answer, feel free to answer your own post and accept that as the answer.
