I'd like to run a search that gives me the metadata for all reports/alerts (creator, app, schedule, etc.) so that I can view all of this information on a single page. Is this possible? The main goal is to look at all scheduled report/alert times and see whether there are any times when too many reports/alerts are running at once. So to be clear, I'm not after fired alerts, I'm after the schedule (though I suppose I could use the times on the fired alerts... hmm).
Anyhow, I'm open to suggestions and discussion. 😄
If your primary purpose is to identify times when you have too many searches running, you can use the internal logs of the Splunk scheduler. The base query is:
index=_internal sourcetype=scheduler
These logs carry useful fields such as scheduled_time (when the scheduled report/alert was supposed to run), dispatch_time (when it actually started running), and status (the outcome of the run). You can then apply whatever aggregation commands you need to find the busy periods.
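For example, here's a minimal sketch that counts scheduled runs per five-minute bucket and measures how far behind schedule they started (the span and the lag calculation are my own choices, not part of the answer above, so adjust to taste):

index=_internal sourcetype=scheduler
| eval dispatch_lag = dispatch_time - scheduled_time
| timechart span=5m count AS scheduled_runs avg(dispatch_lag) AS avg_dispatch_lag

Spikes in scheduled_runs, or a growing avg_dispatch_lag, point to windows where schedules are stacking up.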
If you just want to list all searches/reports, you can use the | rest command, which returns all the metadata for each saved search.
| rest splunk_server=local /servicesNS/-/-/saved/searches
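To get the single-page view you described, you can then table the fields you care about. A sketch, assuming the standard saved/searches attribute names (cron_schedule, eai:acl.owner, eai:acl.app, and so on); verify them against the raw output on your instance:

| rest splunk_server=local /servicesNS/-/-/saved/searches
| search is_scheduled=1
| table title eai:acl.app eai:acl.owner cron_schedule next_scheduled_time

The is_scheduled=1 filter drops saved searches that have no schedule at all.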
From the result set returned by the REST API call, is there a way to identify which are alerts and which are reports?
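One common heuristic (my assumption, so check the attribute values on your Splunk version) is that alerts have alert_type set to something other than "always", or have alert tracking enabled, while plain reports have neither:

| rest splunk_server=local /servicesNS/-/-/saved/searches
| eval kind = if(alert_type!="always" OR tonumber('alert.track')=1, "alert", "report")
| table title kind alert_type alert.track cron_schedule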
I think I can work with that. Thank you!