We have dashboard searches that run far too long because they include several subsearches and calculate data over the latest 30 days. As a result, the daily scheduled PDF delivery of that dashboard fails with a PDF-sending error.
We split the heavy searches into several smaller ones, but the time each takes to complete varies from day to day.
Is there a way to make saved searches run in sequence, other than spreading out their schedules?
This might help:
1) Set up a KV store collection with a counter that gets incremented in one of its columns.
Example column names: key, counter (number), startbasesearchtime (time), finishsearchtime (time)
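If the collection does not exist yet, it can be defined in an app's configuration files. A minimal sketch, assuming the collection and lookup are both named `searchscheduler` (the name used in the search below) and live in your app's `local` directory:

```ini
# collections.conf -- defines the KV store collection and its field types
[searchscheduler]
field.counter = number
field.startbasesearchtime = time
field.finishsearchtime = time

# transforms.conf -- lookup definition so inputlookup/outputlookup can reach it
[searchscheduler]
external_type = kvstore
collection = searchscheduler
fields_list = _key, counter, startbasesearchtime, finishsearchtime
```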
2) Save each search as a scheduled search, e.g. [SS0, SS1, SS2, ...], and schedule all but SS0 at midnight (they will fail by default).
3) Schedule SS0 to execute at the time you want the chain to start; upon completion, it increments the counter by 1 and records the time in the startbasesearchtime field. SS0 should also be run as a case-based search on a cron schedule of every 30 minutes to 1 hour:
| inputlookup searchscheduler
| eval currkey = _key
| sort - currkey
| head 1
| eval temp = case(counter = 0, [ | savedsearch ... ])
| eval startbasesearchtime = if(counter = 0, temp, startbasesearchtime)
| eval finishsearchtime = if(counter = N, temp, finishsearchtime)
| eval counter = counter + 1
| table currkey counter startbasesearchtime finishsearchtime
| outputlookup append=true key_field=currkey searchscheduler
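Note that the pipeline above assumes at least one record already exists in the collection, so the collection needs seeding before the first scheduled run. A minimal one-off seed search, assuming the same lookup name:

```spl
| makeresults
| eval counter = 0
| table counter
| outputlookup append=true searchscheduler
```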
By the time you get to the last search, all search executions should have completed. If any run fails, that search will re-execute on the next scheduled pass, because the counter will not have been incremented.
Note: you will also have an audit trail, since every execution is written to the KV store.
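A downstream search can also gate itself on the counter instead of relying purely on its schedule. A hedged sketch (the use of `map` here is my own addition, not part of the scheme above), assuming SS2 should run only once its two predecessors have completed and incremented the counter to 2:

```spl
| inputlookup searchscheduler
| eval currkey = _key
| sort - currkey
| head 1
| where counter == 2      ``` produces zero rows, so map runs nothing, unless the chain is ready ```
| map search="| savedsearch SS2"
```

With this pattern, SS2's cron schedule can simply retry every 30 minutes; it only does real work once the counter shows its predecessors have finished.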