Splunk Search

How to create time dependent thresholds from lookup?

willemjongeneel
Communicator

Hello,

I have a question about using lookups in a search. I have a scheduled search whose result I want to compare against a threshold value that varies over time and is stored in a lookup. I've got a threshold value for every hour of every weekday (7×24 threshold values).

For now, the threshold is static in the query:

index=index1
| stats dc(Response) as value
| eval threshold=50000
| eval state=(case(value < threshold*0.25, "critical", value >= threshold*0.5, "normal", (value>=threshold*0.25 AND value < threshold*0.5), "warning"))
| table value, state

The CSV basically looks like this:
time (weekday & hour), threshold

How should I format the weekday and hour in the CSV, and how do I use the lookup in the search to get the threshold value that is applicable for the current time on the current weekday?

Thanks, kind regards,
Willem Jongeneel

1 Solution

harshpatel
Contributor

Hi @willemjongeneel,

You can combine the weekday and hour into one field, e.g. Mon19.

Let's say you have the following lookup file:

threshold   weekdayhour
5000         Mon19

You can convert _time to your lookup's format and get the threshold value from the lookup as follows:

| makeresults count=1 
| eval weekdayhour=strftime(_time, "%a%H") 
| lookup threshold_weekday.csv weekdayhour output threshold

For information on time format variables: Link
Hope this helps.
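As a side note (not part of the original answer): Python's strftime supports the same %a (abbreviated weekday) and %H (hour, 00–23) directives as Splunk's strftime, so you can preview what the lookup key will look like. This is a hypothetical illustration; the function name is made up.

```python
from datetime import datetime

def weekday_hour_key(ts):
    """Return a lookup key like 'Mon19', mirroring strftime(_time, "%a%H")."""
    # %a -> abbreviated weekday name, %H -> zero-padded 24-hour clock
    return ts.strftime("%a%H")

# 2024-01-01 was a Monday, so 19:00 that day maps to "Mon19"
key = weekday_hour_key(datetime(2024, 1, 1, 19, 0))
print(key)  # -> Mon19
```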


willemjongeneel
Communicator

Hello @kmorris_splunk, @harshpatel

I've tried your suggestions. I build the time like this:
| eval tijd = strftime(_time,"%w%H")

And my CSV looks like this (tijd goes from 000 to 623):

threshold,tijd
50000,000
50000,001
50000,002
50000,003
50000,004
50000,005
50000,006

What I want to achieve is to get a distinct count of field values and compare it to the threshold that is applicable at the time the search starts. Complete search:

index="index1"
| eval tijd = strftime(_time,"%w%H")
| lookup threshold.csv tijd output threshold
| stats dc(Request) as value
| eval State=(case(value < threshold*0.25, "critical", value >= threshold*0.5, "normal", (value >= threshold*0.25 AND value < threshold*0.5), "warning"))
| table value, threshold, State

I do not get the result I want: when I try this search I only get a value for the distinct count, but nothing for threshold or State. I also tried the "makeresults count=1" option. Can you advise on how to change my search?

Thanks in advance!

Kind regards,
Willem Jongeneel


harshpatel
Contributor

Hi @willemjongeneel,

After your stats command, the only available field will be "value". If you need other fields as well (say, threshold), you can carry them through the aggregation like this:

index="index1" 
| eval tijd = strftime(_time,"%w%H") 
| lookup threshold.csv tijd output threshold
| stats dc(Request) as value, values(threshold) as threshold
| eval State=(case(value < threshold*0.25, "critical", value >= threshold*0.5, "normal", (value >= threshold*0.25 AND value < threshold*0.5), "warning"))
| table value, threshold, State

Hope this helps.
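To make the underlying behavior concrete, here is a hypothetical Python sketch (illustrative names and data, not part of the original answer) of why threshold vanishes after stats and how aggregating it alongside the distinct count fixes the case logic:

```python
# Toy events, as if each already got its threshold from the lookup
events = [
    {"Request": "a", "threshold": 50000},
    {"Request": "b", "threshold": 50000},
    {"Request": "a", "threshold": 50000},
]

# "stats dc(Request) as value" keeps only the aggregates it computes...
value = len({e["Request"] for e in events})

# ...so threshold must also be aggregated, e.g. "values(threshold) as threshold"
threshold = sorted({e["threshold"] for e in events})[0]

# Mirrors the case() expression from the search
state = ("critical" if value < threshold * 0.25
         else "normal" if value >= threshold * 0.5
         else "warning")
print(value, threshold, state)  # -> 2 50000 critical
```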


willemjongeneel
Communicator

Hi @harshpatel

Thank you, this is what I needed.

Kind regards,
Willem Jongeneel


kmorris_splunk
Splunk Employee

I would concatenate the date_hour and date_wday fields to use as a key for your lookup.

[YOUR BASE SEARCH HERE]
| eval day_hour=date_hour . "_" . date_wday 
| table _time day_hour

This would create a field called day_hour that looks like the following: 9_monday.

Your CSV would then look like the following:

day_hour,threshold
9_monday,50000
10_monday,60000
11_monday,50000
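The key construction and lookup above can be sketched in Python (a hypothetical illustration; the function name and data are made up, mirroring the CSV rows shown):

```python
# Rows from the example CSV: day_hour,threshold
csv_rows = [
    ("9_monday", 50000),
    ("10_monday", 60000),
    ("11_monday", 50000),
]
thresholds = dict(csv_rows)

def day_hour_key(hour, weekday):
    # Mirrors: eval day_hour = date_hour . "_" . date_wday
    return f"{hour}_{weekday}"

print(thresholds[day_hour_key(10, "monday")])  # -> 60000
```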
