Alerting

How can I query to get all alerts which are configured?

maniraghavendra
New Member

Hi,

I want to export all of the alerts that I have configured under Search, Reports, and Alerts via a Splunk query.

Regards,
Mani

1 Solution

woodcock
Esteemed Legend

Like this:

ALL APPS:

| rest /servicesNS/-/-/saved/searches | search alert.track=1 | fields title description search disabled triggered_alert_count actions action.script.filename alert.severity cron_schedule

Search app only:

| rest /servicesNS/-/search/saved/searches | search alert.track=1 | fields title description search disabled triggered_alert_count actions action.script.filename alert.severity cron_schedule


woodcock
Esteemed Legend

Be sure to click Accept on the best working answer to close the question.


aokur_splunk
Splunk Employee
Splunk Employee

This isn't necessarily accurate: if alert.track has not been set for some reason, this will not return all results. You can verify this yourself by comparing the counts in the GUI against the results of the searches above.

The workaround is to narrow down the search results in a different way. Most configured alerts have at least one action associated with them, so I used something along the lines of | rest /servicesNS/-/search/saved/searches | search actions!="" | <fields go here>
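Combining the two filters from this thread is one way to cast a wider net: a saved search is returned when either condition matches. This is a hedged sketch, not an official Splunk recipe (the field list is borrowed from the earlier answer); verify the counts against the GUI:

| rest /servicesNS/-/-/saved/searches | search alert.track=1 OR actions!="" | fields title description search disabled triggered_alert_count actions alert.severity cron_schedule

Note that actions!="" will also match scheduled reports that send email, so this errs on the side of returning too much rather than too little.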

woodcock
Esteemed Legend

The question was to show all alerts, not all saved searches that have alert actions. My answer does the former, for sure.

aokur_splunk
Splunk Employee
Splunk Employee

It doesn't return all alerts, however. alert.track is set to 1 by default, but if someone changes it, or an app sets it otherwise, the query above does not return all alerts, alert action or not. This comment thread serves to warn users of the query above to be on the lookout for this scenario; it is not a guarantee that all configured alerts will be returned.

woodcock
Esteemed Legend

Incorrect. Originally, only alerts had alert actions, but customers insisted, and now reports can also have alert actions, so there is literally no functional difference between the two. There is now only a taxonomical difference, which you are free to slice any way you like. Settings-wise, the difference between the two is defined in savedsearches.conf as: alert.track=1 means alert and alert.track=0 means report. That is it.

Smashley
Explorer

This is an older thread, but I recently stumbled upon it and ran into some confusion, so in the hopes of clarifying the topic a bit, here's my experience.

I've used the previously suggested search (i.e., including | search alert.track=1) and found that, as of this writing (Splunk 9.0.4), alert.track=1 seems to mean that the 'Add to Triggered Alerts' action is enabled for that particular alert, and because that specific action isn't available for Reports, one can conclude it is in fact an Alert. Conversely, though, alert.track=0 isn't exclusive to Reports: an Alert can use other actions besides 'Add to Triggered Alerts', like email/Slack/etc., and in that case alert.track=0.

In fact, that 'Add to Triggered Alerts' action isn't listed in the 'actions' field in the search results; it only shows up as alert.track=1. So to summarize: alert.track=1 does explicitly mean Alert, but alert.track=0 does not exclude it from being an Alert.

I'm unsure whether this functionality changed at some point in the years since the question was asked. Depending on one's interpretation of OP's question, the alert.track value may or may not be relevant. In any case, thanks to all who responded, as this helped me a great deal in solving my own requirements.
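Another field worth checking when distinguishing the two is alert_type, which the saved/searches REST endpoint also returns: plain reports carry the default value "always", while alerts created with a trigger condition carry a value such as "number of events" or "custom". A hedged sketch along those lines (again, verify the counts against the GUI before relying on it):

| rest /servicesNS/-/-/saved/searches | search alert_type!="always" | fields title alert_type alert.track actions cron_schedule

This sidesteps the alert.track ambiguity discussed above, though like any single-field filter it is a heuristic, not a guarantee.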

curtismcginity
Explorer

I observe results contradictory to this. Specifically, I have a group of `alerts` visible in the GUI at `~/app/search/alerts` and not visible in `~/app/search/reports`. The entire group has `alert.track=0`.

If these were created as `reports` with `alert` actions (in this case, email), then how/why does Splunk know to make these visible in `Alerts` and not `Reports`? If these are created as `alerts` and are visible in `Alerts`, then why does Splunk set `alert.track=0`?


(P.S. How did you accomplish that inline formatting in your response? Can't seem to make it work...)
