Splunk Search

How to use a different time range within a subsearch?

lblackey
Engager

Splunk rookie here, so please be gentle. I am hoping someone can help me with a date-time range issue within a subsearch. My goal is to create a dashboard where you enter a date-time range (either from a time picker or something like the last 15 minutes), and then have it retrieve results for the current search as well as the same time range from a week ago. The time range needs to be the difference between the "earliest" and "latest" values. I'd like to expand this to show me the same time range from the current week, the previous week, 3 weeks back, and 4 weeks back.

The problem I am having is passing the "earliest" and "latest" values to the subsearch. For example, if earliest is 10/15/2015:10:00:00 and latest is 10/15/2015:12:00:00 (a 2-hour span), you would think these two values would be passed to the subsearch. Then, in the subsearch, you would think that earliest=-7d would now be 10/8/2015:10:00:00 and latest=-7d would now be 10/8/2015:12:00:00, which is still a two-hour span, but for the previous week. However, it looks like the earliest and latest values are being set to the same date-time value, thus returning zero results. This seems to happen whether I specify weeks, days, minutes, or seconds. I also don't want to explicitly specify the values for earliest and latest in the subsearch unless it can be an assignment from a variable or just subtracting weeks, days, hours, or minutes. I have also tried assigning the value of a variable to earliest and latest within the subsearch, but this does not seem possible.

Would someone have an idea of how to make what I am trying to do work? Below is a code example (does not work) that shows what I was trying to do with variable assignment in the subsearch. Thanks in advance for any assistance you can offer.

index=TEST CLASSNAME=JOB | eval DiffTime=latest-earliest | eval BeginTime=earliest-DiffTime-604800 | 
eval CompleteTime=latest-604800 | stats count(eval(SITENAME == "PHX")) as Phoenix | eval Key="First Week" | 
append [ search index=TEST CLASSNAME=JOB earliest=-$BeginTime$ latest=-$CompleteTime$ | 
stats count(eval(SITENAME == "PHX")) as Phoenix | eval Key="Second Week" ] | table Key, Phoenix

woodcock
Esteemed Legend

You should upvote any answers that were helpful, pick the best one, and click Accept. The basic problem is that the values for earliest and latest coming from a subsearch must be numbers (no relative time specifiers).
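For example, a subsearch that hands back plain epoch numbers for the window in the question would look something like this (just a sketch: the hard-coded timestamps stand in for the example window above, and in practice you would compute them as the answers below do with addinfo and relative_time; gentimes is only there to produce a single row):

index=TEST CLASSNAME=JOB [ | gentimes start=-1 | eval earliest=strptime("10/08/2015 10:00:00","%m/%d/%Y %H:%M:%S") | eval latest=strptime("10/08/2015 12:00:00","%m/%d/%Y %H:%M:%S") | table earliest latest ] | stats count(eval(SITENAME == "PHX")) as Phoenix

The subsearch expands to a numeric earliest=... latest=... pair before the outer search runs. A relative token such as -7d, by contrast, is evaluated against "now", not against the window the outer search was given.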


woodcock
Esteemed Legend

I am not sure that I understand this comment of yours:

Unfortunately, moving "stats count by SITENAME" inside the quoted map search string kept me from doing the time conversion of _time to "date", and I could no longer get the date to appear in my search results. I even tried moving the convert inside the map search string as well, but could not get it to work.

But perhaps this twist on the solution works for you:

| noop | stats count AS pos | eval pos="1,2,3,4" | makemv delim="," pos | mvexpand pos | addinfo | 
eval info_min_time=info_min_time - case(pos == 1, 0, pos == 2, 7*86400, pos == 3, 14*86400, pos == 4, 21*86400) | 
eval info_max_time=info_max_time - case(pos == 1, 0, pos == 2, 7*86400, pos == 3, 14*86400, pos == 4, 21*86400) | 
map search="search index=TEST CLASSNAME=JOB earliest=$info_min_time$ latest=$info_max_time$ | bucket _time span=1d | stats count by SITENAME _time" | 
stats count by SITENAME _time

lblackey
Engager

I was able to take the suggestion from woodcock, modify it from an "if" statement to a "case" statement, and get it to run, showing a four-week comparison. Here is the search string:

| noop | stats count AS pos | eval pos="1,2,3,4" | makemv delim="," pos | mvexpand pos | addinfo | 
eval info_min_time=info_min_time - case(pos == 1, 0, pos == 2, 7*86400, pos == 3, 14*86400, pos == 4, 21*86400) | 
eval info_max_time=info_max_time - case(pos == 1, 0, pos == 2, 7*86400, pos == 3, 14*86400, pos == 4, 21*86400) | 
map search="search index=TEST CLASSNAME=JOB earliest=$info_min_time$ latest=$info_max_time$ | stats count by SITENAME" | 
sort str(SITENAME) | table SITENAME, count

I had trouble figuring out the response from somesoni2. If I paste the search string and run it in Splunk, it gives me an error of:
"Error in 'search' command: Unable to parse the search: 'AND' operator is missing a clause on the left hand side".

I found the results from woodcock's suggestion hard to work with. As an example, if "stats count by SITENAME" was moved outside of the quoted map search string, the subsearch results were truncated at 10000. Moving it inside the quotes resolved this. Here is the search string where the subsearch results were truncated:

| noop | stats count AS pos | eval pos="1,2,3,4" | makemv delim="," pos | mvexpand pos | addinfo | 
eval info_min_time=info_min_time - case(pos == 1, 0, pos == 2, 7*86400, pos == 3, 14*86400, pos == 4, 21*86400) | 
eval info_max_time=info_max_time - case(pos == 1, 0, pos == 2, 7*86400, pos == 3, 14*86400, pos == 4, 21*86400) | 
map search="search index=TEST  CLASSNAME=JOB earliest=$info_min_time$ latest=$info_max_time$" | 
convert timeformat="%Y-%m-%d" ctime(_time) AS date | stats count by SITENAME date

Unfortunately, moving "stats count by SITENAME" inside the quoted map search string kept me from doing the time conversion of _time to "date", and I could no longer get the date to appear in my search results. I even tried moving the convert inside the map search string as well, but could not get it to work.

Another interesting thing about woodcock's solution is that it jumps right into "Finalizing results" and zero events are found, yet I do see the correct count numbers in my search results. I assume this is because the subsearch is doing all the work, and the parent search is what finds zero events?

I would be interested in getting the response from somesoni2 to work using the append command with a subsearch. Again, Splunk rookie here, so please hang in there with me. I appreciate all the help you can offer.

MuS
SplunkTrust

Hi lblackey,

If you just want to compare a count on days or weeks with previous days or weeks, you could use the timewrap app https://splunkbase.splunk.com/app/1645/ and run this search:

index=TEST CLASSNAME=JOB SITENAME="PHX" earliest=-3w@w latest=-0w@w | timechart count | timewrap w

This is just an example and I'm not sure if this is what you want....

cheers, MuS

woodcock
Esteemed Legend

You should click "Accept" to close this question and upvote the answers that helped you come up with your final solution.


woodcock
Esteemed Legend

Do it like this:

| noop | stats count AS pos | eval pos="1,2" | makemv delim="," pos | mvexpand pos | addinfo | 
eval info_min_time=info_min_time - if((pos=1), 0, 7*24*60*60) | 
eval info_max_time=info_max_time - if((pos=1), 0, 7*24*60*60) | 
map search="search earliest=$info_min_time$ latest=$info_max_time$ | PUT YOUR SEARCH HERE"
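Plugged into the search from the question, the template might look something like this (a sketch only: it filters SITENAME=PHX directly rather than using the eval-based count, and map substitutes each row's $pos$ and time values into the quoted search):

| noop | stats count AS pos | eval pos="1,2" | makemv delim="," pos | mvexpand pos | addinfo | 
eval info_min_time=info_min_time - if((pos=1), 0, 7*24*60*60) | 
eval info_max_time=info_max_time - if((pos=1), 0, 7*24*60*60) | 
map search="search index=TEST CLASSNAME=JOB SITENAME=PHX earliest=$info_min_time$ latest=$info_max_time$ | stats count AS Phoenix | eval Week=$pos$"

map runs the quoted search once per input row, so pos=1 covers the current window and pos=2 the same window a week earlier, and the result rows come back stacked.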

somesoni2
SplunkTrust

Here is how I would do it:

index=TEST CLASSNAME=JOB | stats count(eval(SITENAME == "PHX")) as Phoenix | eval Key="First Week"
| append [search index=TEST CLASSNAME=JOB 
    [| gentimes start=-1 | addinfo | eval earliest=relative_time(info_min_time,"-7d") | eval latest=relative_time(info_max_time,"-7d") | table earliest latest ] 
    | stats count(eval(SITENAME == "PHX")) as Phoenix | eval Key="Second Week"] | table Key, Phoenix
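To cover the four-week comparison described in the question, the same pattern can simply be repeated with -14d and -21d offsets (a sketch only, following the answer above as written):

index=TEST CLASSNAME=JOB | stats count(eval(SITENAME == "PHX")) as Phoenix | eval Key="First Week"
| append [search index=TEST CLASSNAME=JOB [| gentimes start=-1 | addinfo | eval earliest=relative_time(info_min_time,"-7d") | eval latest=relative_time(info_max_time,"-7d") | table earliest latest ] | stats count(eval(SITENAME == "PHX")) as Phoenix | eval Key="Second Week"]
| append [search index=TEST CLASSNAME=JOB [| gentimes start=-1 | addinfo | eval earliest=relative_time(info_min_time,"-14d") | eval latest=relative_time(info_max_time,"-14d") | table earliest latest ] | stats count(eval(SITENAME == "PHX")) as Phoenix | eval Key="Third Week"]
| append [search index=TEST CLASSNAME=JOB [| gentimes start=-1 | addinfo | eval earliest=relative_time(info_min_time,"-21d") | eval latest=relative_time(info_max_time,"-21d") | table earliest latest ] | stats count(eval(SITENAME == "PHX")) as Phoenix | eval Key="Fourth Week"]
| table Key, Phoenix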

somesoni2
SplunkTrust

Where,
| gentimes - generates a single dummy row for processing
| addinfo - adds common search information, including the search time range, to the current search result (the dummy row)
info_min_time and info_max_time - the fields added by the addinfo command; they denote the current search's earliest and latest
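To see what that dummy-row machinery produces, you can run the inner time subsearch from the answer above on its own (verbatim, over the same time range you would give the outer search); it returns a single row holding the two shifted epoch values that the outer search then applies as its earliest and latest:

| gentimes start=-1 | addinfo | eval earliest=relative_time(info_min_time,"-7d") | eval latest=relative_time(info_max_time,"-7d") | table earliest latest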
