Splunk Search

How to set up a scheduled search to run after another one completes?

iKate
Builder

Hi!
We have some searches on a dashboard that run far too long because they include several subsearches and calculate data for the last 30 days. As a result, the daily scheduled PDF delivery of that dashboard fails with a PDF-sending error instead of emailing the PDF.
We split the heavy searches into several smaller ones, but the time each of them takes to complete varies from day to day.
Is there a way to set up saved searches to run in sequence, other than just spacing out their schedules?

thilles
Explorer

It seems this hasn't been implemented yet. There is an idea posted about it here: Scheduled reports which are triggered by a different scheduled | Ideas (splunk.com).
Give it a vote if you're reading this question and need this as well.


manjunathmeti
SplunkTrust

You can write a custom alert action that calls the next saved search.

Check this:
https://docs.splunk.com/Documentation/Splunk/8.0.2/AdvancedDev/ModAlertsIntro
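
For reference, a minimal sketch of such an action might look something like this (assuming a custom parameter called next_search, defined in the action's alert_actions.conf, that names the report to kick off; the stdin payload fields and the saved-search dispatch endpoint are covered in the docs above):

import json
import ssl
import sys
import urllib.parse
import urllib.request

def dispatch_saved_search(server_uri, session_key, search_name):
    # Kick off the named report immediately via the REST API,
    # independently of its own cron schedule.
    url = "%s/services/saved/searches/%s/dispatch" % (
        server_uri, urllib.parse.quote(search_name, safe=""))
    request = urllib.request.Request(
        url,
        data=urllib.parse.urlencode({"trigger_actions": "1"}).encode(),
        headers={"Authorization": "Splunk %s" % session_key},
        method="POST",
    )
    # The management port usually has a self-signed certificate.
    context = ssl.create_default_context()
    context.check_hostname = False
    context.verify_mode = ssl.CERT_NONE
    return urllib.request.urlopen(request, context=context).read()

if __name__ == "__main__":
    # Splunk runs a custom alert action script as "script.py --execute"
    # and passes a JSON payload describing the triggering alert on stdin.
    if len(sys.argv) > 1 and sys.argv[1] == "--execute":
        payload = json.load(sys.stdin)
        next_search = payload["configuration"].get("next_search", "")
        if next_search:
            dispatch_saved_search(payload["server_uri"],
                                  payload["session_key"],
                                  next_search)

Attach the action to search N with next_search pointing at search N+1, and each report will trigger the next one when it finishes.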


iKate
Builder

Thanks! Hopefully it will be implemented as a built-in feature sooner or later.


to4kawa
Ultra Champion

Tokens can handle sequencing (e.g. a dashboard search's <done> handler setting a token that the next search depends on), but I don't think that's your actual problem.


anmolpatel
Builder

This might help:
1) Set up a KV store collection with a counter that is incremented in one of its columns.
Column names, e.g.: _key, counter (number), start_base_search_time (time), complete_final_search_time (time)

2) Save each search as a scheduled search, e.g. [SS0, SS1, SS2, ...], and set all but SS0 to be scheduled at midnight (they will fail by default).

3) Set SS0 to execute at the time you want it to run, and on completion increment the counter by 1 and write the time into the start_base_search_time field. SS0 should also be run as a case search on a cron schedule of every 30 minutes to 1 hour:
EG:
| inputlookup search_scheduler
| eval curr_key = _key
| sort - curr_key
| head 1
| eval temp = case(counter == 0, [ | savedsearch SS0 | return $time_field ], counter == 1, [ | savedsearch SS1 | return $time_field ], ... , counter == N, [ | savedsearch SSN | return $finish_time ])
| eval start_base_search_time = if(counter == 0, temp, start_base_search_time)
| eval finish_search_time = if(counter == N, temp, finish_search_time)
| eval counter = counter + 1
| table counter start_base_search_time finish_search_time
| outputlookup search_scheduler append=true key_field=curr_key

By the time you get to the last search, this should have completed all of the search executions. If there was any failure, the failed search will be re-executed in the next run, since the counter won't have been incremented.
Note: you will also have an audit trail, as all executions are written to the KV store.


iKate
Builder

Thanks for a thorough answer! It looks like it can help as a workaround.


anmolpatel
Builder

@iKate glad you found it helpful
