We have some searches on a dashboard that run for far too long because they include several subsearches and calculate data over the latest 30 days. As a result, the daily scheduled PDF of that dashboard fails to be emailed, with a PDF-sending error.
We split the heavy searches into several smaller ones, but the time each one takes to complete varies from day to day.
Is there a way to set up saved searches to run in sequence, other than just spacing out their schedules?
It seems this hasn't been implemented yet. There is an idea posted about it: Scheduled reports which are triggered by a different scheduled | Ideas (splunk.com).
Give it a vote if you are reading this question and need this as well.
This might help:
1) set up a KV store collection with a counter column that gets incremented
column names, e.g.: _key, counter (number), start_base_search_time (time), complete_final_search_time (time)
2) save each search as a scheduled search, e.g. [SS0, SS1, SS2, ...], and set all but SS0 to be scheduled at midnight (they will fail by default)
3) set SS0 to execute at the time you would like the chain to start and, upon completion, increment the counter by 1 and record the time in the start_base_search_time field. SS0 should also be run as a case-gated search on a cron schedule of every 30 minutes to 1 hour:
| inputlookup search_scheduler
| eval curr_key = _key
| sort - curr_key
| head 1
| eval temp = case(counter = 0, [| savedsearch "SS0" | stats count | eval t = now() | return $t])
| eval start_base_search_time = if(counter = 0, temp, start_base_search_time)
| eval finish_search_time = if(counter = N, temp, finish_search_time)
| eval counter = counter + 1
| table curr_key counter start_base_search_time finish_search_time
| outputlookup append=true key_field=curr_key search_scheduler
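For completeness, here is a minimal sketch of the KV store collection that the search above assumes exists. The collection name search_scheduler and the field names are illustrative, matching the columns from step 1:

```
# collections.conf (in your app's local/ directory)
[search_scheduler]
field.counter = number
field.start_base_search_time = time
field.complete_final_search_time = time

# transforms.conf - expose the collection as a lookup
[search_scheduler]
external_type = kvstore
collection = search_scheduler
fields_list = _key, counter, start_base_search_time, complete_final_search_time
```

You may need to seed the collection with one initial row (counter = 0) via | outputlookup before the first scheduled run.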
By the time you get to the last search (N above being the index of the final search), all search executions should have completed. If any execution failed, it will be re-executed on the next run, because the counter would not have been incremented.
Note: you will also have an audit trail, as all executions are written to the KV store.
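One way each downstream search SS1..SSN could gate itself on the counter is with a guard query in front of the real work, so it only proceeds once its turn arrives. This is a sketch under the same assumptions as above (collection search_scheduler, placeholder saved search name "SS1"); the map command runs its search once per input row, so producing zero rows from the guard skips execution entirely:

```
| inputlookup search_scheduler
| eval curr_key = _key
| sort - curr_key
| head 1
| where counter = 1
| map search="| savedsearch \"SS1\""
```

Schedule this wrapper every 30 minutes like SS0; on runs where the counter has not yet reached 1, the where clause filters out the row and nothing executes.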