Archive

Is it possible to make part of an SPL search execute asynchronously?

Contributor

Hello,

I have a part of my alert search where, based on previously set variables, a database dump is triggered per DB node. This is the |append part below:

|rename comment AS " ****************************** Start:    rtedump triggering ************************************************************************  "
 | eval rtetrigger=case(ALERT_TYPE="MAIN" AND trigger=0,"1",1<2,"0")
 | where rtetrigger = 0 
 | append
 [

   | dbxquery query="call \"ML\".\"ML.PROCEDURES::PR_ALERT_TYPE_ANALYSING_LAST_MINUTES_ALL_HOSTS\"('BWP', to_timestamp(to_nvarchar('2019-06-19 10:31:00', 'YYYY-MM-DD HH24:MI'),'YYYY-MM-DD HH24:MI'), ?)" connection="HANA_MLBSO"

     | eval HOST="ls5921/26"
     | eval Existing_Host=HOST 
     | eval FirstPart=substr(Existing_Host,1,4), SecondPart=substr(Existing_Host,5,7), SecondPart=split(SecondPart,"/") 
     | mvexpand SecondPart 
     | eval host_to_trigger=FirstPart+SecondPart
     | dedup host_to_trigger 
     | table host_to_trigger    
   | map maxsearches=20 search="dbxquery query=\"call SYS.MANAGEMENT_CONSOLE_PROC('runtimedump dump','$host_to_trigger$:30240',?)\" connection=\"HANA_MLBSO_BWP\" "
 ] 

Now, the dump triggering can be quite a time-intensive process and can take several minutes per host / node. Also, I do not really need to wait for it - I would just return text in the e-mail action saying that it has been triggered.
Given that, I would like to execute this part asynchronously, as it does not bring any result back anyway.
Is it possible?
I know this goes in the direction of a custom alert action / calling Python out of the alert, but somehow I feel more comfortable having it in SPL.

Kind Regards,
Kamil

1 Solution

Ultra Champion

The simplest solution for your case is probably to move the job to the background: https://docs.splunk.com/Documentation/Splunk/latest/Search/Aboutjobsandjobmanagement#Managing_jobs_w...

As far as I know, there is no way to pass values to a report and then trigger it to run asynchronously directly from SPL. You can pull in the results of a previously executed saved search, but that doesn't feel like it suits your need - unless you would use it to pull the results from the alert search into the search running the DB dumps.

What you could perhaps do is configure the alert search to populate a lookup and have a scheduled report use that lookup for its input variables (and have the scheduled report empty the lookup afterwards, to prevent further execution until the alert search runs again).
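A minimal sketch of that handoff, assuming a CSV lookup named `hosts_to_dump.csv` (a hypothetical name). The alert search writes the hosts it wants dumped; the scheduled report reads them, runs the dumps, and clears the lookup:

```
Alert search (tail):
    ... | table host_to_trigger
    | outputlookup hosts_to_dump.csv

Scheduled report:
    | inputlookup hosts_to_dump.csv
    | map maxsearches=20 search="dbxquery query=\"call SYS.MANAGEMENT_CONSOLE_PROC('runtimedump dump','$host_to_trigger$:30240',?)\" connection=\"HANA_MLBSO_BWP\""

Scheduled report cleanup (empty the lookup):
    | inputlookup hosts_to_dump.csv | where 1=0 | outputlookup hosts_to_dump.csv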


Contributor

@FrankVl

Thank you, this is what I will do. For other reasons as well, I need to familiarize myself with the KV store, so I guess it is high time.
Short questions to that:
- Is it possible to define a column that must have only unique values in the KV store? In my case I would like to allow only one record per hostname.
- What is the easiest way to delete KV store entries from SPL? Do I have to find the corresponding _key(s) and then outputlookup empty values to those _key(s)?
Or is there an easier way? (In SPL - I do not want to use curl for that.)

Kind Regards,
Kamil


Ultra Champion

I think the basic approach for removing KV store records is to do an inputlookup, filter for the records you want to keep, and then do an outputlookup. Further questions on how to work with the KV store are perhaps better suited for a separate post on Answers.
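That keep-and-rewrite pattern can be sketched as follows, assuming a KV store lookup definition named `rtedump_hosts` with a `hostname` field (both hypothetical names). Note that outputlookup to a KV store lookup replaces the collection's contents unless append=true is set:

```
| inputlookup rtedump_hosts
| where NOT hostname="ls5921"    ``` keep everything except the records to drop ```
| outputlookup rtedump_hosts
```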

If my answer solves your question, please mark it as accepted.


SplunkTrust

Hi @damucka,

It doesn't have to be a custom alert action. You can just save this SPL search as a scheduled report, have it run as often as needed, and then have it notify you by email once done:
https://docs.splunk.com/Documentation/Splunk/latest/Report/Schedulereports

Have you tried this already ?

Cheers,
David


Contributor

Hi David,

Thank you, good point. I did not think about this.
However, I do not want to schedule my report blindly - it has to depend on the input variables (the HOST to trigger the dump for), and the timing is also set by the alert / its variables.
In short, this report would have to be embedded in the alert search.

So, is it possible to call the report from SPL, pass the variables to it, and proceed with the SPL processing while the report executes in the background?

In the example above, how would I replace the map command with a saved search / report, passing the host_to_trigger variable to it?

Kind Regards,
Kamil


SplunkTrust
SplunkTrust

Oh, I see what you mean. If you want to include a custom part in your saved search, you can call the results of your saved search using the savedsearch command:
https://docs.splunk.com/Documentation/Splunk/7.3.0/SearchReference/Savedsearch

Similarly, you can map results from a previous search into a savedsearch to apply them as filters.
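For example, the map command in the original search could drive a saved search instead of an inline dbxquery. This sketch assumes a saved search named `trigger_rtedump` (a hypothetical name) whose SPL contains a `$host_to_trigger$` token; the savedsearch command substitutes key=value arguments into such tokens:

```
... | table host_to_trigger
| map maxsearches=20 search="savedsearch trigger_rtedump host_to_trigger=\"$host_to_trigger$\""
```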

Another way to do this is on a dashboard, where you could split the query into two parts and use a macro to define whichever value you wish to modify as the search gets executed:
https://docs.splunk.com/Documentation/Splunk/latest/Knowledge/Definesearchmacros

Let me know if this helps.
