Splunk Search

How to pass value in subsearches without searching

NAGA4
Engager

Hi All,

I have a requirement like this.

First, I need to fetch all the failed (let's say skipped) searches by their savedsearch_name and scheduled_time.

If a search is skipped at a scheduled_time, I then need to check whether that scheduled_time lies between the durable_cursor and the next scheduled_time.

Let's say a saved search called ABC failed (skipped) at 1712121019.

So now I need to check whether this failed scheduled_time lies between the upcoming durable_cursor and the next scheduled_time. The next scheduled_time is 1712121300, and in that event I see a durable_cursor value of 1712121000, which means my failed time is covered by that run.

How can I detect via a Splunk query whether my failed searches are covered by the next run?

I tried subsearch logic to get the failed savedsearch_name and scheduled_time. I can pass savedsearch_name, but not scheduled_time. So my idea is to run a first query to get each failed saved search's name and its failed scheduled_time, and then a second query to check whether that scheduled_time lies between durable_cursor and the next scheduled_time. How can I achieve this?
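One way to pass both values per result row (a sketch only, untested against this data) is the map command, which runs an inner search once per result of the outer search and substitutes $field$ tokens:

```
index=_internal sourcetype=scheduler earliest=-1h@h latest=now
| stats latest(status) as FirstStatus by scheduled_time savedsearch_name
| search NOT FirstStatus IN ("success","delegated_remote")
| map maxsearches=10 search="search index=_internal sourcetype=scheduler savedsearch_name=\"$savedsearch_name$\" earliest=-1h@h latest=now
    | eval failed_time=$scheduled_time$
    | eval range=if(failed_time>=durable_cursor AND failed_time<=scheduled_time, \"COVERED\", \"NOT COVERED\")
    | table savedsearch_name durable_cursor scheduled_time range"
```

Note that map runs one search per row (capped by maxsearches), so it can be expensive if many searches are skipped.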

 

Any inputs would be appreciated. Thanks 


bowesmana
SplunkTrust

I've not used durable searches, so I'm not totally sure how they work in terms of the timestamp data in the index. However, have you tried including the durable_cursor in your stats, like this:

index=_internal sourcetype=scheduler earliest=-1h@h latest=now 
``` Find the latest durable_cursor for this saved search ```
| eventstats max(durable_cursor) as durable_cursor by savedsearch_name

``` and include it in the stats ```
| stats latest(status) as FirstStatus max(durable_cursor) as durable_cursor by scheduled_time savedsearch_name 
| search NOT FirstStatus IN ("success","delegated_remote")

However, I don't see how you can do the if test when you don't have next_scheduled_time in the _internal index data - you will need to use the REST API to get the next scheduled time.
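For reference, a minimal sketch of pulling the next scheduled time from REST (this assumes next_scheduled_time comes back as a string like "2024-04-03 06:00:00 UTC", hence the strptime conversion to epoch):

```
| rest splunk_server=local "/servicesNS/-/-/saved/searches" search="is_scheduled=1 disabled=0"
| eval next_scheduled_time_e=strptime(next_scheduled_time, "%F %T %Z")
| table title next_scheduled_time next_scheduled_time_e
```

With next_scheduled_time_e as an epoch value, it can be compared directly against durable_cursor and scheduled_time.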

Or maybe you can make the eventstats/stats do this

| eventstats max(durable_cursor) as durable_cursor max(eval(if(status="success", scheduled_time, null()))) as max_success_scheduled_time by savedsearch_name
| stats latest(status) as FirstStatus max(durable_cursor) as durable_cursor max(max_success_scheduled_time) as max_success_scheduled_time by scheduled_time savedsearch_name 

but I am unfamiliar with durable searches, so I don't know how these timestamps work.


NAGA4
Engager

Thank you @bowesmana. However, I could not get the results with the above. Let me try to lay out the requirement with an example again.

Search 1 : 

index=_internal sourcetype=scheduler earliest=-1h@h latest=now 
| stats latest(status) as FirstStatus by scheduled_time savedsearch_name 
| search NOT FirstStatus IN ("success","delegated_remote")

This query will give result like below

scheduled_time   savedsearch_name   FirstStatus
1712131500       ABC                skipped

Now I want to take savedsearch_name ABC and scheduled_time=1712131500 into the next query and search like below:

index=_internal sourcetype=scheduler savedsearch_name="ABC" earliest=-1h@h latest=now
| eval failed_time=1712131500
| eval range=if((failed_time>=durable_cursor AND failed_time<=scheduled_time),"COVERED","NOT COVERED")
| where durable_cursor!=scheduled_time
| table savedsearch_name durable_cursor scheduled_time range

04-03-2024 05:38:18.025 +0000 INFO SavedSplunker ... savedsearch_name="ABC", priority=default, status=success, durable_cursor=1712131400, scheduled_time=1712131600

Combining both into one search is fine. If not, taking the values, writing them into a lookup, and then referring to it later is also fine.
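The lookup hand-off could look like this (a sketch; the lookup file name failed_searches.csv is made up). First, a scheduled search writes the failed rows out:

```
index=_internal sourcetype=scheduler earliest=-1h@h latest=now
| stats latest(status) as FirstStatus by scheduled_time savedsearch_name
| search NOT FirstStatus IN ("success","delegated_remote")
| rename scheduled_time as failed_time
| outputlookup failed_searches.csv
```

Then a second search filters to those saved searches, pulls failed_time back in via lookup, and evaluates coverage:

```
index=_internal sourcetype=scheduler earliest=-1h@h latest=now
    [| inputlookup failed_searches.csv | fields savedsearch_name ]
| lookup failed_searches.csv savedsearch_name OUTPUT failed_time
| eval range=if(failed_time>=durable_cursor AND failed_time<=scheduled_time, "COVERED", "NOT COVERED")
| where durable_cursor!=scheduled_time
| table savedsearch_name durable_cursor scheduled_time failed_time range
```

The subsearch here returns only savedsearch_name values, so it works as a filter; the failed_time value is then reattached per saved search by the lookup command.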


bowesmana
SplunkTrust

You can do something like this - I don't know what you mean by the durable_cursor, but this will append the list of scheduled saved searches, with the next scheduled time calculated from the REST data, and then join the data together based on the saved search name:

search your skipped searches
calculate durable_cursor
| append [
  | rest splunk_server=local "/servicesNS/-/-/saved/searches" search="is_scheduled=1 disabled=0"
  | eval next_scheduled_time_e=strptime(next_scheduled_time, "%F %T %Z")
  | fields title next_scheduled_time_e
  | rename title as savedsearch_name
]
| stats values(*) as * max(durable_cursor) as durable_cursor by savedsearch_name
| where next_scheduled_time_e>durable_cursor

 
