I have configured my Splunk alert as shown below. When my alert condition is triggered, I get 2 email notifications instead of just one. Any idea why this is? I have configured the search to run every minute using a cron expression, to check for my criteria over the last 1 minute, and to trigger the alert once, so I don't know why I get multiple emails for the same alert. I would like to get only one email notification when Number of Sources > 0.
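For reference, the relevant parts of the saved search behind this alert look roughly like the following (the stanza name, search string, and recipient are placeholders for my actual values):

[My Source Alert]
search = index=my_index sourcetype=my_sourcetype
enableSched = 1
cron_schedule = * * * * *
dispatch.earliest_time = -1m@m
dispatch.latest_time = now
# trigger condition: Number of Sources greater than 0, trigger once
counttype = number of sources
relation = greater than
quantity = 0
alert.digest_mode = 1
action.email = 1
action.email.to = me@example.com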
Any help would be appreciated. Thanks
With a "Last 1 min" time range, your search window can actually span parts of two minutes (it's basically -1m@m [the start of the previous minute] to now() [the current time]), so depending on exactly when each scheduled search runs, the same event can satisfy your alert condition in two consecutive runs. For example, a run dispatched at 10:05:03 searches 10:04:00 to 10:05:03, and the next run dispatched at 10:06:02 searches 10:05:00 to 10:06:02, so an event at 10:05:01 is counted by both. Also, running every minute could be overkill (unless this is super urgent). So here is what I would suggest:
1) Change the alert frequency to, say, every 5 minutes. It's fast enough.
2) Optionally, instead of starting the cron at the 0th minute, start at some other minute. */5 * * * * will run the search every 5 minutes, starting at minute 0, then 5, then 10, and so on. Many basic searches follow this pattern and cause congestion during those minutes, so let's start at, say, the 2nd minute instead, i.e. 2-59/5 * * * *. We'll adjust the time range accordingly as well.
3) Allow some time for the data to be ingested and become searchable (there is a short delay between data being monitored, parsed, indexed, and made searchable), so building a delay into the time range ensures you're searching all of the data. With a cron of 2-59/5 * * * *, you can use a time range with earliest/Start time of -7m@m and latest/Finish time of -2m@m. This way your search still covers 5 minutes' worth of data, but shifted back by 2 minutes, giving the indexing process 2 minutes to complete. The sketch after this list shows how these settings fit together.
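In savedsearches.conf terms, the suggested schedule and time range would look something like this (the stanza name is a placeholder; if you edit the alert through the UI, the same values go into the cron expression and the earliest/latest time range fields):

[My Source Alert]
# run at minutes 2, 7, 12, ... instead of the busy 0, 5, 10, ... slots
cron_schedule = 2-59/5 * * * *
# search a 5-minute window, shifted back 2 minutes to allow for indexing lag
dispatch.earliest_time = -7m@m
dispatch.latest_time = -2m@m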
I tried using the cron 2-59/5 * * * * and -2m@m (latest) for the time range as you suggested, but I still get 2 emails sent (one email when I forward the file and then another email 5 minutes later).