Alerting

Custom Trigger Condition (Percent increase)

summitsplunk
Communicator

I want to add a "custom" trigger condition to an alert so that it triggers only if the number of search results increases by x percent over 1 hour. How would I go about doing this?

adonio
Ultra Champion

hello there,

I would recommend building a full search that captures your condition and returns results only when the condition is true, rather than writing the logic into the custom trigger condition. The alert can then trigger on a simple condition like "number of results is greater than 0".
There are many answers and examples on this portal; here are links to a few:
https://answers.splunk.com/answers/226668/alert-when-there-is-a-x-increase-in-all-events-dur.html
https://answers.splunk.com/answers/8156/rapid-growth-of-values-then-alerting-on-it.html
https://answers.splunk.com/answers/526032/calculate-percentage-increasedecrease-of-indexing.html
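
for example, here is a minimal sketch of that approach, assuming a base search of index=main sourcetype=my_sourcetype and a 10 percent threshold (both are placeholders, swap in your own search and threshold):

    index=main sourcetype=my_sourcetype earliest=-2h@h latest=@h
    | bin _time span=1h
    | stats count by _time
    | sort _time
    | streamstats current=f last(count) as prev_count
    | eval pct_increase = round((count - prev_count) / prev_count * 100, 2)
    | where pct_increase > 10

this compares the most recent full hour to the hour before it and returns a row only when the count grew by more than 10 percent, so you can schedule the alert hourly and set its trigger condition to "number of results is greater than 0" instead of writing the percent logic into the custom trigger.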

hope it helps
