Splunk Search

Is it possible to get info_min_time and info_max_time from the main search into a subsearch?

simon_b
Path Finder

Hi, let me try to explain my problem. I have a main search with a time range (typically "last 4 hours") selected with the time picker. In addition, I join a subsearch where I want to calculate the average of some values over a bigger time range (typically "last 7 days"). To do that I use the earliest and latest modifiers in the subsearch.

Is it somehow possible to get/access the values of info_min_time and info_max_time (which the addinfo command produces) from the main search into the subsearch?


gcusello
SplunkTrust

Hi @simon_b,

it isn't possible to pass parameters from the main search to a subsearch; you have to change the logic of your search.

But what's the reason for doing that?

Maybe you could create a single search with two parts, containing both the main search and the subsearch.

e.g. something like this:

(<search1>) OR (<search2> earliest=-7d latest=now)
| addinfo
| ...
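As a purely illustrative sketch of the same pattern (the index names and the "value" field are placeholders, not from your data), the two parts can be told apart afterwards by index:

(index=index1) OR (index=index2 earliest=-7d latest=now)
| eval part=if(index=="index1", "recent", "baseline")
| stats avg(value) AS average by part

This way both time ranges live in one search, and no parameters need to cross a subsearch boundary.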

Ciao.

Giuseppe


simon_b
Path Finder

Thanks @gcusello for the reply.

Unfortunately that solution doesn't work for my case. As I explained, in the subsearch I would like to calculate the average of a value over the last 7 days, but only for the hours covered by the main search's time range.

For example: if my main search looks at the data from 06:00 to 10:00 today, I want to calculate the average over the last 7 days using only the hours 06:00 to 10:00 of each day.


gcusello
SplunkTrust

Hi @simon_b,

you have to add an eval command to identify the values to consider, something like this:

<your_search>
| eval interesting_value=if(date_hour>5 AND date_hour<11, value, "")
| timechart avg(interesting_value) AS average

In a few words, you have to create a field that considers only the values in the hours you need (06:00-10:00) and then calculate the average on this field.

Ciao.

Giuseppe


simon_b
Path Finder

Ciao @gcusello,

Yes, I know how to do that. The problem is that I need the time range of the main search to filter for those hours in the subsearch.


PickleRick
SplunkTrust

It does not work this way.

Subsearches (in their most typical form) are not "called" from the main search. The subsearch is run first, and then its results are passed into the main search, rendered according to the explicit return command or the implicit format command.

So the subsearch runs either with the search boundaries set by the time picker or with explicitly stated earliest/latest parameters. There is no way of "passing variables" from the main search to a subsearch.


gcusello
SplunkTrust

Hi @simon_b,

don't think in terms of a subsearch; build your search using parentheses in the main search, and use an eval to take only the values you need.

Could you share your search in terms of rules and desired output, without worrying about syntax?

Ciao.

Giuseppe


simon_b
Path Finder

Hi @gcusello , the search at the moment looks something like this:

index=index1
| regex id != "2|3|4" 
| join id
    [ search index=index2 latest=-0d earliest=-7d
    | regex id != "2|3|4" 

    ***some eval commands***

    | where (date_hour>hour_min) AND (date_hour<hour_max) 
    | stats mean(value1) AS "mean_value1" by id]
| table _time id mean_value1

The parameters I need in the subsearch are "hour_min" and "hour_max", which represent the earliest and latest hours of the main search's time picker.


gcusello
SplunkTrust

Hi @simon_b,

you could reverse the join, something like this:

index=index2 latest=-0d earliest=-7d
| regex id != "2|3|4" 

    ***some eval commands***

| join id
    [ search index=index1 | regex id != "2|3|4" | addinfo]
| where (date_hour>hour_min) AND (date_hour<hour_max) 
| stats mean(value1) AS "mean_value1" by id
| table _time id mean_value1
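One detail worth noting: hour_min and hour_max are not fields that addinfo creates by itself. A possible sketch (assuming info_min_time and info_max_time are carried over from the subsearch by the join) is to derive them with strftime just before the where:

| eval hour_min=tonumber(strftime(info_min_time, "%H"))
| eval hour_max=tonumber(strftime(info_max_time, "%H"))

Since addinfo runs inside the subsearch, info_min_time/info_max_time there reflect the inner search's time range, which is what the time picker sets for index1.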

Ciao.

Giuseppe


PickleRick
SplunkTrust

Don't use join unless you absolutely have to.

It seems that you're not "thinking in SPL" yet 😉

You can usually get around joining with cleverly constructed stats.
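For example, one possible stats-based shape for the search in this thread (illustrative only; the exact details depend on your data, and baseline_value is a hypothetical helper field):

(index=index1) OR (index=index2 earliest=-7d latest=now)
| regex id != "2|3|4"
| eval baseline_value=if(index=="index2", value1, null())
| stats avg(baseline_value) AS mean_value1 latest(_time) AS _time by id
| table _time id mean_value1

The eval keeps value1 only for the baseline index, so the avg over it ignores the recent events, and a single pass over both indexes replaces the join.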
