How do I create an alert to trigger at discrete intervals?

neiljpeterson
Communicator

I would like to configure an alert that triggers on every increase of X in a count field Y.

To the user, this would look like:

"count is now at 1000!"

[15 mins goes by]

"count is now at 2000!"

[5 mins goes by]

"count is now at 3000!"

5 mins is my search interval.

I don't want an alert at a regular interval; I only want it to trigger when the count is X greater than it was at the last alert.

I realize there is a "number of results rises by" trigger condition, but my search outputs a table with a count field that I would like to use instead of the raw event count.

This seems like a simple thing, but I can't figure out how to do it without searching _internal and the scheduler logs for the last triggered alert, which seems messy.

Please help! Much appreciated.

Thanks,

Neil


richgalloway
SplunkTrust

@neiljpeterson Is your problem resolved? If so, please accept an answer.

---
If this reply helps you, Karma would be appreciated.

janispelss
Path Finder

If you're using at least Splunk version 7.0.0, you could use the "Output results to lookup" alert action. You would create a lookup containing the initial threshold and change the alert's search query to read the threshold from that lookup. Once the alert triggers, it would save its results back to the lookup, to be used the next time the search is scheduled to run.
The exact implementation would depend on what exactly you want to do.
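
For illustration, here is a minimal sketch of that approach, assuming an alert step of 1000, a placeholder base search, and a lookup file named alert_threshold.csv seeded with a single row last_threshold=0 (all of these names and values are hypothetical; adjust them to your data):

index=main sourcetype=my_data
| stats count AS current_count
| appendcols [| inputlookup alert_threshold.csv]
| where current_count >= last_threshold + 1000
| eval last_threshold = floor(current_count / 1000) * 1000
| table last_threshold

With the trigger condition set to "number of results is greater than 0" and the "Output results to lookup" action writing back to alert_threshold.csv, each triggered run raises the stored threshold to the latest multiple of 1000, so the alert fires again only after the count grows by at least another 1000.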


anjambha
Communicator

Hi neiljpeterson,

Can you share some sample output data?

If your sample data or query output looks like the example below, you can use the following query.

Sample Data :

source,count
/opt/splunk/var/log/splunk/splunkd.log,7

Query:

earliest=-5m index="_internal" source="/opt/splunk/var/log/splunk/splunkd.log"
| stats count as "Current_Count" by source
| eval Last_alert_count = [search earliest=-10m latest=-5m index="_internal" source="/opt/splunk/var/log/splunk/splunkd.log" | stats count by source | return $count]
| where Current_Count > Last_alert_count

Also, if your first column (the x-axis field) has multiple, possibly repeated values, you can try this:

earliest=-5m index="_internal"
| stats count as "Current_Count" by source
| join source [search earliest=-10m latest=-5m index="_internal" | stats count as last_count by source]
| where Current_Count > last_count
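
Note that both queries above trigger on any increase, while the original ask was to trigger only on every increase of X. Assuming X = 1000, the final clause could be tightened to, for example:

| where Current_Count >= last_count + 1000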