I have set up alerting for my app so that it emails the user whenever today's count or volume falls outside a range of +/- 25% of the average. The alert is configured to send an email and to show up in the "Triggered Alerts" dashboard.
My users would like a panel integrated into their dashboard that shows the alerts that have fired in the past 24 hours. However, they want more detail than I can retrieve from the audit log: I'd like to display the results from the alert that fired rather than just the fact that it fired.
My thought was to run
index=_audit action=alert_fired
and then, for each sid, do a loadjob to get the results:
index=_audit action=alert_fired ss_app=br | map search="| loadjob $sid$"
This does return the result fields (yay!), but I need to associate the results with the alert that generated them, so that I can access trigger_time, expiration, and a few other fields from the audit record.
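One idea I'm considering (untested) is to use map's token substitution to copy the audit fields onto the loaded results, so each row keeps the context of the alert that produced it. This is only a sketch and assumes the audit events carry ss_name, trigger_time, and expiration:
index=_audit action=alert_fired ss_app=br
| map maxsearches=50 search="| loadjob $sid$ | eval sid=\"$sid$\", savedsearch=\"$ss_name$\", trigger_time=\"$trigger_time$\", expiration=\"$expiration$\""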
My end goal would be to have something like:
triggered_date, site, server, status, severity
2015-12-07, siteA, serverB, low volume, MEDIUM
2015-12-07, siteA, server C, high count, MEDIUM
Try the REST API:
| rest /services/alerts/fired_alerts splunk_server=local | table eai:acl.owner eai:acl.app id title triggered_alert_count
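If you need per-instance detail rather than just counts, you can try drilling into the individual fired-alert entries. This is only a sketch; the wildcard path and the field names (sid, trigger_time, severity) are assumptions you should verify on your Splunk version:
| rest /services/alerts/fired_alerts/- splunk_server=local
| table savedsearch_name sid trigger_time severity
The sid column could then be fed into loadjob the same way as in your map search above.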
I am using a summary index to tackle this: whenever an alert is sent, I enable summary indexing in the alert actions, and then you can build a dashboard for the users on top of that summary index. Data written to a summary index is not counted against the license meter.
Were you able to include fields like severity, trigger_time, and expiration in the summary index somehow? I tried this and I get my results, but no information about the alert that was triggered.
We capture all of this information in the search itself and write it to the summary index at the same time, e.g. ... | eval triggered_time=now() | eval severity=1 and so on.
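As a rough sketch (the index, sourcetype, and threshold values below are made up, not from your environment), the alert search itself carries the extra fields, with the summary index alert action enabled:
index=myapp sourcetype=traffic earliest=-24h
| stats count AS today_count BY site server
| eval status=case(today_count < 75, "low count", today_count > 125, "high count")
| where isnotnull(status)
| eval trigger_time=now(), severity="MEDIUM"
The dashboard panel then reads it back from the summary index, for example:
index=summary search_name="My volume alert"
| table trigger_time site server status severity
If search_name is not present in your summary-indexed events, you can add your own marker field in the alert search instead (e.g. | eval alert_name="My volume alert") and filter on that.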