I have a couple of alerts for license usage set to check every hour and fire when usage exceeds 75%. At the moment, I receive an email every hour after the threshold is reached, but I only want to be notified once per day. If I use the throttle option and suppress for 1 day, it waits a full 24 hours before alerting again. This is a problem because if the threshold is exceeded earlier the following day, I won't be notified until that 24-hour window expires. I would like to suppress the alert until 12 am that night instead. Can anyone please help? Thank you
I posted a solution to this general problem here: https://answers.splunk.com/answers/337985/throttle-alert-once-per-day.html
In short, the search I used to trigger once-a-day alerting when over the license limit is this:
| rest splunk_server=local /services/licenser/pools
| rename title AS Pool
| search [| rest splunk_server=local /services/licenser/groups | search is_active=1 | eval stack_id=stack_ids | fields stack_id]
| eval quota=if(isnull(effective_quota),quota,effective_quota)
| eval "Percentage of daily license limit used"=round(used_bytes/quota*100,2)
| eval "Alert time"=strftime(now(), "%T %Z")
| eval alert_count_today=[search index=_internal sourcetype=scheduler thread_id=AlertNotifier* savedsearch_name="License Limit Exceeded: Over 100% Usage" earliest=@d | where alert_actions!="" | stats count | return($count)]
| where 'Percentage of daily license limit used' > 100 AND alert_count_today = 0
| fields "Alert time" "Percentage of daily license limit used"
For more details see the original post.
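Because the search itself checks alert_count_today = 0, every scheduled run after the first match finds the already-fired alert in _internal and returns no rows, so no throttling needs to be configured, and earliest=@d means the suppression resets at midnight. For reference, a saved-search stanza along these lines would schedule it hourly. This is a minimal sketch, assuming you manage savedsearches.conf directly; the stanza name matches the savedsearch_name used in the subsearch above, and the email address is a placeholder:

[License Limit Exceeded: Over 100% Usage]
enableSched = 1
# run at the top of every hour
cron_schedule = 0 * * * *
# paste the full search from above here, on a single line
search = | rest splunk_server=local /services/licenser/pools | ...
# fire when the search returns any rows
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = you@example.com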
Try this as your alert search (no throttling needs to be set):
| rest /services/licenser/usage
| eval "% used"=round(slaves_usage_bytes/quota*100,2)
| appendcols [search index=_internal sourcetype=scheduler thread_id=AlertNotifier* savedsearch_name="PUTYOURALERTSEARCHNAMEHERE" earliest=@d | head 1 | table _time]
| where '% used' > 75 AND isnull(_time)
| fields "% used", "updated"
The appendcols subsearch looks for the timestamp of an alert fired today (earliest=@d starts the window at midnight). If it finds one, _time is non-null and the where clause filters out the result, so the alert does not fire again. You therefore get at most one notification per day, and the suppression automatically ends at 12 am.
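If the suppression doesn't seem to kick in, you can run the subsearch on its own to confirm that Splunk logged the alert firing today. A minimal check, assuming your alert is saved under the name PUTYOURALERTSEARCHNAMEHERE as above:

index=_internal sourcetype=scheduler thread_id=AlertNotifier* savedsearch_name="PUTYOURALERTSEARCHNAMEHERE" earliest=@d
| table _time savedsearch_name alert_actions

If this returns a row with a populated alert_actions field, the main search will see it and stay quiet for the rest of the day.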