Hi,
How do we list all of the saved scheduled jobs on a Splunk setup -- by user, by day, by search string, and by title of the saved search?
Also, I want to plot a day's view of scheduled jobs -- i.e. a 0-23 hour scale in 1-hour increments -- showing all the jobs for a given user.
Basically, I want to know how many jobs are scheduled at the same time on a given day, and by which users.
I have the following REST call:
| rest /services/saved/searches | stats values(search) as QueryName, values(title) as JobName, values(cron_schedule) as ScheduledAt by eai:acl.owner, title, cron_schedule
But I'm not sure how to transform the cron expression into a time I can plot.
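For context, here's roughly what I'm after -- a quick sketch in plain Python (not SPL), assuming standard five-field cron syntax where only the minute and hour fields matter for a daily view:

```python
# Rough sketch: expand the minute and hour fields of a five-field cron
# expression into (hour, minute) run slots within one day.
# Handles "*", "*/n", comma lists, and ranges; the day/month/weekday
# fields are ignored since we only want an hourly daily view.

def expand_field(field, lo, hi):
    """Expand one cron field into a sorted list of integers in [lo, hi]."""
    values = set()
    for part in field.split(","):
        step = 1
        if "/" in part:
            part, step_s = part.split("/")
            step = int(step_s)
        if part == "*":
            start, end = lo, hi
        elif "-" in part:
            start_s, end_s = part.split("-")
            start, end = int(start_s), int(end_s)
        else:
            start = end = int(part)
        values.update(range(start, end + 1, step))
    return sorted(values)

def daily_slots(cron):
    """Return all (hour, minute) run times of a cron expression within one day."""
    minute_f, hour_f = cron.split()[:2]
    return [(h, m) for h in expand_field(hour_f, 0, 23)
                   for m in expand_field(minute_f, 0, 59)]

# "0 */6 * * *" runs at 00:00, 06:00, 12:00, 18:00
print(daily_slots("0 */6 * * *"))
# [(0, 0), (6, 0), (12, 0), (18, 0)]
```

Something like this could turn each saved search's cron_schedule into slot data I can feed back into Splunk for a timechart.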
thanks..
Execution details of every scheduled saved search are stored in the _internal index under the scheduler sourcetype. You might want to have a look at that as well.
Sample query:
index=_internal sourcetype=scheduler status=success | table user, app, savedsearch_name, scheduled_time, sid, result_count
This will give you a list of all saved-search executions, with fields like user context, app context, search name, the time it executed (in epoch format, usable with timechart), the number of rows returned, etc.
Thanks. It seems like btool is what I can use to get what we want. I'm thinking we can scrub the btool output to generate scheduled-job data and then feed that back into Splunk to get the plots we wanted. Thanks for the command.
Maybe you could play around with the btool command: run "splunk btool savedsearches list" and then parse the output from that. That way you would be getting the data you need from the saved configuration files rather than from the historical logs.
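As a rough sketch of that parsing step (plain Python; it assumes btool prints the usual .conf layout of "[stanza]" headers followed by "key = value" lines, and the sample stanza names are made up):

```python
# Hypothetical sketch: pull (saved search name, cron_schedule) pairs
# out of "splunk btool savedsearches list" output, assuming standard
# .conf-style stanzas.

def parse_btool(text):
    """Map each stanza name to its cron_schedule value, if present."""
    schedules = {}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1]
        elif current and line.startswith("cron_schedule"):
            _, _, value = line.partition("=")
            schedules[current] = value.strip()
    return schedules

sample = """[Errors last hour]
cron_schedule = 0 * * * *
enableSched = 1
[Weekly report]
cron_schedule = 0 6 * * 1
"""
print(parse_btool(sample))
# {'Errors last hour': '0 * * * *', 'Weekly report': '0 6 * * 1'}
```

The resulting pairs could then be written out as CSV and indexed back into Splunk for plotting.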
Thanks, but your answer takes me into the past -- e.g. how many jobs executed last hour, yesterday, two days ago, etc. I want a forward-looking view: how many jobs are going to run tomorrow, this week, or next week, across all users, as of today?
Where is the cron expression for scheduled, enabled searches stored? In which index?
Also, how do I create an alert if, say, more than N jobs are scheduled at the same time?
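To illustrate the collision check I have in mind, a rough Python sketch (the job names, slots, and threshold below are all made up for the example):

```python
# Sketch: given each job's (hour, minute) run slots for a day, count
# how many jobs land in the same slot and flag slots hit by more
# than n jobs.
from collections import Counter

def crowded_slots(job_slots, n):
    """job_slots: {job_name: [(hour, minute), ...]}.
    Return {slot: count} for slots where more than n jobs collide."""
    counts = Counter(slot for slots in job_slots.values() for slot in slots)
    return {slot: c for slot, c in counts.items() if c > n}

jobs = {
    "errors_hourly": [(h, 0) for h in range(24)],   # every hour on the hour
    "daily_report":  [(6, 0)],                      # 06:00 daily
    "weekly_rollup": [(6, 0)],                      # also 06:00
}
print(crowded_slots(jobs, 2))
# {(6, 0): 3}
```

If the expanded slot data were indexed back into Splunk, the same threshold check could presumably drive a scheduled alert instead.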
Do you need this to be a REST call? I mean, we have an app called SoS (Splunk on Splunk) that fits the bill.
http://apps.splunk.com/app/748/
To go along with the app, a starter might be: | rest /services/search/jobs | table custom.dispatch.earliest_time custom.dispatch.latest_time custom.search | search custom.search!="| rest*"
Any options other than btool?
Thanks for the pointer. I'm aware of SoS, but I don't know how to do this with it.
What is the name of the index where SoS stores the scheduled jobs' cron information?
For example, if user A has a job scheduled to run once a week, and user B has another job that runs every day, my plot should project all the jobs that are going to run this week. Basically, I want to see how many jobs will flood the system by running at around the same time -- some kind of visualization giving a forward-looking view of the jobs that will run this week, next week, and so on.