Splunk Enterprise Security

How to edit my search to use eval variables in subsearches

splunker1981
Path Finder

Hello Splunk experts,

I'm stuck trying to get something working and hoping one of you experts can point me in the right direction. I have a search that uses append to join two searches, each of which focuses on a specific time window. I then pass the data to a custom command for further processing. This all works great and is what I use to create an alert in ES.

The problem comes when I try to execute the drilldown from ES, because I am not sure how to replace the earliest= and latest= time modifiers in each of the individual searches. I have played around with the addinfo command, trying to leverage the info_min_time and info_max_time fields, but if I use those in the search nothing comes back. I've tried $info_min_time$, as well as % and a single $, and none of those do the trick. I've also tried using eval to set the info_* values and then calling them in the search as a filter, but that doesn't work either. If I manually replace the strings info_min_time and info_max_time with their respective epoch values within the search, it works. So the question is: how do I use those values within my search? Here is an example search I am playing around with, which uses the time picker to define the time window and then tries to set the time range for each of the searches I am running.

Original working search used to create the alert

search index=myIndex earliest=-3h latest=-2h 
| stats Here
| fields fieldA, fieldB, fieldC 
| eval x="f"
| append 
    [ search index=myIndex earliest=-2h latest=-1h 
      | stats Here...
      | fields fieldA, fieldB, fieldC 
      | eval x="l"
    ] 
| custom stuff_here

Here is what I am trying to use for the drilldown. If I replace info_min_time and info_max_time with the epoch values stored in them, I get the correct output. So the question is: how do I pass the start/end date range from the time picker into my search?

| stats count 
| fields - count
| addinfo 
| eval start=info_min_time
| eval end=info_max_time
| search index=myIndex info_min_time 
| stats Here
| fields fieldA, fieldB, fieldC 
| eval x="f"
| append 
    [ search index=myIndex info_max_time 
      | stats Here...
      | fields fieldA, fieldB, fieldC 
      | eval x="l"
    ] 
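
One variation I've been toying with (completely untested, so treat it as a rough sketch with the stats steps left out) is to let each appended search work out its own window from the drilldown's time range: a makeresults/addinfo subsearch reads the job's boundaries and hands them back as earliest/latest terms, split at the midpoint of whatever the time picker covers. This assumes the subsearches inherit the drilldown's time range and that the picker always spans exactly the two hours I care about:

index=myIndex 
    [ | makeresults 
      | addinfo 
      | eval earliest=info_min_time, latest=round((info_min_time + info_max_time) / 2, 0) 
      | return earliest latest ] 
| fields fieldA, fieldB, fieldC 
| eval x="f" 
| append 
    [ search index=myIndex 
        [ | makeresults 
          | addinfo 
          | eval earliest=round((info_min_time + info_max_time) / 2, 0), latest=info_max_time 
          | return earliest latest ] 
      | fields fieldA, fieldB, fieldC 
      | eval x="l" ] 
| custom stuff_here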

Thanks for the help, experts!

1 Solution

DalJeanis
Legend

You are slipping between talking about searches, talking about alerts, and talking about drilldowns.

The simplest method I can think of, based on your statement that you will always be using successive 1-hour windows: make a single 2-hour window, calculate the midpoint, and set x based on that. It should actually calculate marginally faster.

search index=myIndex earliest=-3h latest=-1h 
| addinfo
| eval info_midpoint = round((info_max_time + info_min_time) / 2, 0)
| eval x=if(_time < info_midpoint, "f", "l")
| stats Here (and remember to use "by x")
| fields x, fieldA, fieldB, fieldC 
| custom stuff_here


splunker1981
Path Finder

Thanks for the reply, DalJeanis. The problem is that if I do search index=myIndex earliest=-3h latest=-1h, I'm already working with an incorrect time range of events. What I would like to do is have it search the time range defined in the time picker and then split that into 1-hour windows, so in essence run the search with earliest equal to info_min_time and latest equal to info_max_time, something like search index=myIndex earliest=info_min_time latest=info_max_time. Whenever I do that nothing comes back, so the question is how to define the earliest/latest ranges in my search so it looks something like this: index="myIndex" [ | stats count | fields - count | addinfo | eval earliest=info_min_time | eval latest=info_max_time | table earliest, latest ]
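
A tidied-up, untested sketch of that last idea, using makeresults so the subsearch has something to run addinfo against, and return so the values come back as earliest=... latest=... terms (assuming the subsearch inherits the drilldown's time range; the stats count on the end is just a sanity check):

index=myIndex 
    [ | makeresults 
      | addinfo 
      | eval earliest=info_min_time, latest=info_max_time 
      | return earliest latest ] 
| stats count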

Thanks


DalJeanis
Legend

Right, I've just shown you an example of how to make that happen. You use the time picker to pick the WHOLE time range, both hours, and feed it into earliest=$timepickerEarliest$ latest=$timepickerLatest$, then split it into a first half and a second half by using addinfo and calculating the midpoint.
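
Roughly like this, where the values(...) aggregations are only stand-ins for whatever your real stats clause does, and the token names are the same illustrative ones as above:

search index=myIndex earliest=$timepickerEarliest$ latest=$timepickerLatest$ 
| addinfo 
| eval info_midpoint = round((info_max_time + info_min_time) / 2, 0) 
| eval x = if(_time < info_midpoint, "f", "l") 
| stats values(fieldA) AS fieldA values(fieldB) AS fieldB values(fieldC) AS fieldC by x 
| custom stuff_here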

If you absolutely HAVE to have the time picker select only the later of the two one-hour windows, then you'll need a macro that recalculates the desired value for earliest, along the lines of
macroDesiredEarliest = ($timepickerEarliest$ + $timepickerEarliest$ - $timepickerLatest$)

Of course, you could just have a single time picker value for the start of the second hour, then use the macro to calculate earliest=$timepickerValue$-3600 and latest=$timepickerValue$+3600 (3600 seconds, i.e. one hour, on either side).
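
Sketching that out (the macro name and argument below are made up, and it assumes the picked value arrives as an epoch timestamp for the start of the second hour), an eval-based macro in macros.conf could build those modifiers, since earliest/latest won't do arithmetic on their own:

# macros.conf -- hypothetical stanza; expects an epoch timestamp for the
# start of the second one-hour window
[two_hour_window(1)]
args = secondHourStart
iseval = 1
definition = "earliest=" . ($secondHourStart$ - 3600) . " latest=" . ($secondHourStart$ + 3600)

A search would then call it as index=myIndex `two_hour_window(1520298000)`, substituting your own epoch value or token.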


splunker1981
Path Finder

My other question is: what's the best way to replace the earliest and latest time ranges in my ES drilldown search, [ search index=myIndex earliest=-2h latest=-1h | stats Here... , with the range from the time picker? In theory I will always be dealing with 2-hour windows, and I need each search to be grouped using eval x="l" or x="f" so I can pass it to my custom command.
