I would like to configure an alert that triggers every X increase in a count field Y.
To the user this would look like
"count is now at 1000!"
[15 mins goes by]
"count is now at 2000!"
[5 mins goes by]
"count is now at 3000!"
5 mins is my search interval.
I don't want an alert at a regular interval. I only want it to trigger if it is X greater than the last alert.
I realize there is a "number of results rises by" trigger condition, but my search outputs a table with a count field I would like to use instead of raw count of events.
Seems like a simple thing, but I can't figure out how to do this without searching the scheduler logs for the last triggered alert, which seems messy.
Please help! Much appreciated.
If you're using at least Splunk version 7.0.0, you could use the "Output results to lookup" alert action. You would create a lookup with the initial threshold, and change the alert's search query to read the threshold from that lookup. When the alert triggers, it saves its results back to the lookup, which the next scheduled run of the search then uses.
The exact implementation would depend on exactly what you want to do. Can you share sample output data?
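A minimal sketch of that approach, assuming a hypothetical lookup file named last_alert_threshold.csv with a single threshold field, and an increment of 1000 (the X from the question). Seed the lookup once:

```spl
| makeresults
| eval threshold = 0
| outputlookup last_alert_threshold.csv
```

Then the scheduled alert search would look something like:

```spl
index=main
| stats count as Current_Count
| appendcols [| inputlookup last_alert_threshold.csv | fields threshold]
| where Current_Count >= threshold + 1000
| eval threshold = Current_Count
| fields threshold
```

If you configure the alert's "Output results to lookup" action to write back to last_alert_threshold.csv, each trigger records the count at which it fired, so the next run only triggers once the count is at least 1000 higher than at the last alert. The index name and field names here are placeholders; adjust them to your actual search.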
If your sample data or query output looks something like the below, you can refer to this query.
Sample data:

earliest=-5m index="_internal" source="/opt/splunk/var/log/splunk/splunkd.log"
| stats count as "Current_Count" by source
| eval Last_alert_count = [ search earliest=-10m latest=-5m index="_internal" source="/opt/splunk/var/log/splunk/splunkd.log"
    | stats count by source
    | return $count ]
| where Current_Count > Last_alert_count
Also, if your first column (the x-axis field) can have multiple or repeated values, you can try this:
earliest=-5m index="_internal"
| stats count as "Current_Count" by source
| join source [ search earliest=-10m latest=-5m index="_internal"
    | stats count as last_count by source ]
| where Current_Count > last_count
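Note that the queries above trigger whenever the current window's count exceeds the previous window's by any amount. If you instead want to fire only when the count has risen by at least a fixed step X (say 1000, as in the question), one possible variation is to change the final condition to compare against that increment:

```spl
earliest=-5m index="_internal"
| stats count as "Current_Count" by source
| join source [ search earliest=-10m latest=-5m index="_internal"
    | stats count as last_count by source ]
| where Current_Count - last_count >= 1000
```

This still compares two adjacent search windows rather than the count at the last triggered alert, so for the exact "X greater than the last alert" behavior the lookup-based approach in the other answer is a better fit.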