Reporting

How do I create an alert to trigger at discrete intervals?

neiljpeterson
Communicator

I would like to configure an alert that triggers every time a count field Y increases by X.

To the user, this would look like:

"count is now at 1000!"

[15 mins goes by]

"count is now at 2000!"

[5 mins goes by]

"count is now at 3000!"

5 mins is my search interval.

I don't want an alert at a regular interval. I only want it to trigger when the count is X greater than it was at the last alert.

I realize there is a "number of results rises by" trigger condition, but my search outputs a table with a count field I would like to use instead of the raw count of events.

It seems like a simple thing, but I can't figure out how to do this without searching _internal and the scheduler logs for the last triggered alert, which seems messy.

Please help! Much appreciated.

thanks

Neil

0 Karma

richgalloway
SplunkTrust
SplunkTrust

@neiljpeterson Is your problem resolved? If so, please accept an answer.

---
If this reply helps you, Karma would be appreciated.
0 Karma

janispelss
Path Finder

If you're using at least Splunk version 7.0.0, you could use the "Output results to lookup" alert action. You would create a lookup with the initial threshold and change the alert search query to read the threshold from that lookup. Once the alert triggers, it would save its results to the lookup to be used the next time the search is scheduled to run.
The exact implementation would depend on what exactly you want to do.
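
As a rough sketch of that approach (the index, sourcetype, lookup name alert_threshold.csv, field last_alert_count, and the 1000-count step are placeholders you would replace with your own): seed the lookup with last_alert_count = 0, then schedule something like:

index=my_index sourcetype=my_sourcetype
| stats count AS current_count ``` count for this run ```
| appendcols [| inputlookup alert_threshold.csv | fields last_alert_count] ``` threshold saved at the last alert ```
| where current_count >= last_alert_count + 1000 ``` keep a result only once the count has grown by the step ```
| eval last_alert_count = current_count ``` new threshold to write back ```
| fields current_count last_alert_count

With the trigger condition set to "number of results is greater than 0" and the "Output results to lookup" action pointed at alert_threshold.csv, each trigger overwrites last_alert_count with the count at trigger time, so the next run compares against the value from the most recent alert rather than against a fixed schedule. If the count hasn't grown by the full step, the where clause removes all rows, the alert stays quiet, and the lookup keeps its old threshold.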

0 Karma

anjambha
Communicator

Hi neiljpeterson,

Can you share some sample output data?

If your sample data or query output looks something like the example below, you can refer to the following query.

Sample Data :

source,count
/opt/splunk/var/log/splunk/splunkd.log,7

Query:

earliest=-5m index="_internal" source="/opt/splunk/var/log/splunk/splunkd.log" | stats count as "Current_Count" by source | eval Last_alert_count = [search earliest=-10m latest=-5m index="_internal" source="/opt/splunk/var/log/splunk/splunkd.log" | stats count by source | return $count] | where Current_Count > Last_alert_count

Also, if your first column (x-axis) has multiple values, you can try this:

earliest=-5m index="_internal" | stats count as "Current_Count" by source | join source [search earliest=-10m latest=-5m index="_internal"  | stats count as last_count by source] | where Current_Count > last_count
0 Karma