Alerting

How do I search for recent alerts fired by Splunk?

Justin_Grant
Contributor

I'd like to build a "Recent Alerts" report listing which alerts have been fired by Splunk in the last few days.

When a Splunk alert is fired, I'm assuming there's an event written somewhere in the _internal index which I can use for this. Does anyone know what search query I should be using to pull these events out of Splunk's internal logs?

BTW, this report will be useful for several reasons, including:

  • troubleshooting problems with our alert scripts (e.g. alert fired but alert script didn't do what we thought it should)
  • giving folks outside the ops team a view into the problems the ops team is working on
  • providing a failsafe option if email is down
2 Solutions

jrodman
Splunk Employee

4.0 doesn't have terribly good log events for alerting. You can see that the search was run, but not that it was run by the scheduler, so you cannot differentiate between manually-initiated and schedule-initiated searches. You can see the python event if the search eventually fires the email-sending command sendemail.py, but that will only catch searches whose conditions were met and which were configured to send email.

In 4.1, all scheduled searches are explicitly logged, as well as the result (conditions met / not met). If a search would have run but was not for some reason, this is also logged. There are some built-in status views that try to give useful reporting on this data, but you can build your own slicings of it.
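
For example, on 4.1+ a rough starting point for slicing that scheduler data could be something like this (assuming the scheduler events expose savedsearch_name and status fields, which may vary by version):

index=_internal sourcetype=scheduler | stats count by savedsearch_name, status

This summarizes how often each scheduled search ran and with what outcome (success, skipped, and so on).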


ftk
Motivator

I use the following search on the _internal index in version 4.1+ to report on alerts that have been triggered:

index="_internal" sourcetype="scheduler" thread_id="AlertNotifier*" NOT (alert_actions="summary_index" OR alert_actions="")

I am excluding summary_index alert actions since I am only interested in "real" alerts, not summary index searches. You can easily build a report based on the results of this search. Especially if you use Splunk for PCI compliance, having a report showing all alerts fired over a period of time will go a long way toward satisfying the daily log review requirement.
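
As one possible report on top of that search (just a sketch, assuming the scheduler events carry a savedsearch_name field):

index="_internal" sourcetype="scheduler" thread_id="AlertNotifier*" NOT (alert_actions="summary_index" OR alert_actions="") | timechart count by savedsearch_name

This charts how many times each alert fired over the selected time range.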


indut
Path Finder

Hi, I have a similar query, but this is different. Is there any index or other place where all the events from fired alerts go? I am not interested in the alert names/saved search names. All I want to know is whether, irrespective of what the alert action is, the fired events/results of an alert are stored somewhere. I would like to know if there is any such index/sourcetype. Thanks in advance.


isoutamo
SplunkTrust
You can store at least some information by choosing the "Log Event" action for the alert. Then just define what you want to store and in which index.
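
For example, if you point the Log Event action at an index called alert_events (the name is just an example), you can review the logged alert events later with something like:

index=alert_events | table _time source sourcetype _raw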
r. Ismo

indut
Path Finder

Thank you for the reply, Soutamo. I have explored this option and I think it works well if every configured alert aligns with this method. Currently there are many alerts already running that are set up with different alert actions, so this method will not help for those scheduled alerts.

I got a response from the Splunk Slack group, quoted below:

jeffland 14 hours ago


When an alert runs, it lives for as long as its expiry setting allows it to. This data is pretty much a regular search job, so there is nothing you can "search" for with SPL, as nothing has been indexed (it lives on the disk of the search head running the search).
Additionally, if you configured alert actions, those run if the criteria were met. If you sent out an email, it will be in the recipient's mailbox. If you indexed something, you'll find it where it was configured through the alert action configuration.
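
If you still want to peek at those search job artifacts from SPL, one option (a sketch, assuming you have access to the search jobs REST endpoint; the available fields can differ by version) is:

| rest /services/search/jobs | search isSavedSearch=1 | table sid label ttl

This lists the jobs currently held on the search head, including those spawned by scheduled alerts, for as long as their ttl keeps the artifacts around.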

 


alexiri
Communicator

You can also use the following search:

index=_audit action=alert_fired

which has the added benefit of giving you the expiration time and the severity. For example, you could create a report of the currently active alerts like this:

index=_audit action=alert_fired | eval ttl=expiration-now() | search ttl>0 | convert ctime(trigger_time) | table trigger_time ss_name severity
