Hi,
I want to export all of the alerts I have configured under Searches, Reports, and Alerts. Is there a Splunk query for this?
Regards,
Mani
Like this:
ALL APPS:
| rest /servicesNS/-/-/saved/searches | search alert.track=1 | fields title description search disabled triggered_alert_count actions action.script.filename alert.severity cron_schedule
Search app only:
| rest /servicesNS/-/search/saved/searches | search alert.track=1 | fields title description search disabled triggered_alert_count actions action.script.filename alert.severity cron_schedule
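If you also need to know which app and owner each alert belongs to, or want the list as a file, something along these lines should work (a sketch - the eai:acl.* fields come from the same REST endpoint, and the outputcsv filename is just an example):
| rest /servicesNS/-/-/saved/searches | search alert.track=1 | fields title eai:acl.app eai:acl.owner description search disabled triggered_alert_count actions alert.severity cron_schedule | outputcsv my_alerts
The CSV is written under $SPLUNK_HOME/var/run/splunk/csv/ on the search head; alternatively, just export the results from the UI.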
Be sure to click Accept on the best working answer to close the question.
This isn't necessarily accurate - if for some reason alert.track has not been set, this will not return all results. You can check this yourself by comparing the GUI counts with the results of the searches above.
The workaround is to narrow down the search results in a different way - most configured alerts have at least one action associated with them, so I used something along the lines of | rest /servicesNS/-/search/saved/searches | search actions!="" | <fields go here>
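If you want to cover both cases at once, you could simply combine the two conditions from this thread (a sketch - this is just a union of the criteria discussed here, not an authoritative definition of an alert):
| rest /servicesNS/-/-/saved/searches | search alert.track=1 OR actions!="" | fields title description search disabled triggered_alert_count actions alert.severity cron_schedule
Appending | stats count makes it easy to compare the result against the counts shown in the GUI.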
The question was to show all alerts, not all saved searches that have alert actions. My answer does the former, for sure.
It doesn't return all alerts, however - alert.track is set to 1 by default, but if someone changes it, or an app sets it differently, the query above does not return all alerts, alert action or not. This comment thread serves to inform users of the query above to be on the lookout for this scenario - it is not a guarantee that all configured alerts will be returned.
Incorrect. Originally only alerts had alert actions, but customers insisted and now reports can also have alert actions, so literally there is no functional difference between the two. There is now only a taxonomical difference, which you are free to slice any way that you like. Settings-wise, the difference between the two is now defined in savedsearches.conf as: alert.track=1 means alert and alert.track=0 means report. That is it.
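If that distinction works for you, you can label every saved search in one pass (a sketch - tonumber() is only a precaution in case the field value comes back as a string, and field names containing dots must be single-quoted in eval):
| rest /servicesNS/-/-/saved/searches | eval type=if(tonumber('alert.track')==1, "alert", "report") | fields title type search disabled actions cron_schedule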
This is an older thread, but I recently stumbled upon it and ran into some confusion, so in the hope that it clarifies the topic a bit, here's my experience.
I've used the previously suggested search (i.e., including | search alert.track=1) and found that, as of this writing (Splunk 9.0.4), alert.track=1 seems to mean that the 'Add to Triggered Alerts' action is enabled for that particular alert, and because that specific 'Add to Triggered Alerts' action isn't available for Reports, one can conclude it is in fact an Alert. Conversely, though, alert.track=0 isn't exclusive to Reports: an Alert can use other actions aside from 'Add to Triggered Alerts', like email/Slack/etc., and in that case alert.track=0.
In fact, that 'Add to Triggered Alerts' action isn't listed in the 'actions' field in the search results; it is only reflected by alert.track=1. So to summarize, alert.track=1 does explicitly mean Alert, but alert.track=0 does not exclude it from being an Alert.
Unsure if this functionality changed at some point in the years since this question was asked. Depending on one's interpretation of the OP's question, the alert.track value may or may not be relevant. In any case, thanks to all who responded, as this has helped me a great deal in solving my own requirements.
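For anyone walking the same path, it may be easier not to filter on alert.track at all, but to pull everything and inspect the relevant fields side by side (a sketch - is_scheduled is another standard field on this endpoint):
| rest /servicesNS/-/-/saved/searches | fields title alert.track actions is_scheduled disabled cron_schedule | sort title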
I observe results contradictory to this. Specifically, I have a group of `alerts` visible in the GUI at `~/app/search/alerts` and not visible in `~/app/search/reports`. The entire group has `alert.track=0`.
If these were created as `reports` with `alert` actions (in this case, email), then how/why does Splunk know to make these visible in `Alerts` and not `Reports`? If these are created as `alerts` and are visible in `Alerts`, then why does Splunk set `alert.track=0`?
(P.S. How did you accomplish that inline formatting in your response? Can't seem to make it work...)