Splunk Search

How can I read values from a lookup table and use them as comparison thresholds (like a matrix table)?

sh254087
Communicator

I have a lookup table that looks like this:

Variable1,Variable2,Score
0,null,3
0,500,2
500,1000,1
1000,null,0

And I have this code -

index="myIndex" sourcetype=mySource Application_Name="BACKUP EXEC"

| fields Application_Name, HostName
| join HostName
    [ search index="myIndex" sourcetype=mySource
    | eval minValue=min(A,B,C,D)
    | eval diff_A=(A-minValue)
    | eval score_A=case(diff_A=0, "3",
                        diff_A>0 AND diff_A<=500, "2",
                        diff_A>500 AND diff_A<=1000, "1",
                        diff_A>1000, "0") ]
| table score_A

Requirement:
The aim is to check which range the value "diff_A" falls into and return the appropriate score. I need to replace the hard-coded threshold values with the values from the lookup table (as in a matrix).

Essentially, I want to have something like this:

  | eval score_A=case(diff_A=Variable1_1, Score_1,
                      diff_A>Variable1_2 AND (diff_A<=Variable2_2), Score_2,
                      diff_A>Variable1_3 AND (diff_A<=Variable2_3), Score_3,
                      diff_A>Variable1_4, Score_4)

I'm able to read the lookup file data by adding this line:

|lookup tco2_lookup.csv Variable1 as var1, Variable2 as var2 OUTPUT Score as score

before the eval command. But I can't find a way to substitute the values from the lookup into those places as required.

Please let me know how I can get this working.


niketnilay
Legend

@sh254087, based on your description, it seems you have hard-coded thresholds that you want to replace with values from a lookup table. The four scores in your lookup example correspond to four different threshold ranges. You should define a row for each score so that the same range logic applies whichever row is picked. With the lookup table you provided, the non-numeric value null means you cannot apply the same range comparison to every row without further conversion. Rather than complicating the logic, define the ranges sequentially. For example:

Change your lookup file tco2_lookup.csv as following:

min,max,score
-1,0,3
0,500,2
500,1000,1
1000,999999999,0
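
To sanity-check which row a given value falls into (using the min-exclusive / max-inclusive convention the map search below relies on), you can run a quick sketch like this; the test value 750 is arbitrary:

```
|  inputlookup tco2_lookup.csv
|  rename min AS lo, max AS hi
|  eval testValue=750
|  where testValue>lo AND testValue<=hi
```

This should return only the row whose range contains 750, i.e. the score 1 row.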

Fetch the four rows using inputlookup and use Splunk's map command to match minValue against each range and decide the score. The following is a run-anywhere example:

|  inputlookup tco2_lookup.csv 
|  map search="|  makeresults
|  eval A=1500, B=2550, C=6150, D=999
|  append [|  makeresults
|  eval A=500, B=250, C=150, D=600]
|  append [|  makeresults
|  eval A=200, B=150, C=40, D=500]
|  append [|  makeresults
|  eval A=-250, B=0, C=-100, D=-15]
|  append [|  makeresults
|  eval A=123, B=0, C=240, D=315]
|  eval minValue= case((A<B OR A=B) AND (A<C OR A=C) AND (A<D OR A=D), A,
                       (B<A OR B=A) AND (B<C OR B=C) AND (B<D OR B=D), B, 
                       (C<A OR C=A) AND (C<B OR C=B) AND (C<D OR C=D), C, 
                       (D<A OR D=A) AND (D<B OR D=B) AND (D<C OR D=C), D)
|  search minValue>$min$ AND minValue<=$max$
|  eval score=$score$"
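
As a side note, the long case() expression can be shortened with eval's built-in min() function (the same function used in the original question). A sketch of a more compact version, assuming the same tco2_lookup.csv layout and matching on diff_A as in the original question rather than on minValue:

```
|  inputlookup tco2_lookup.csv
|  map search="search index=myIndex sourcetype=mySource
|  eval minValue=min(A, B, C, D)
|  eval diff_A=A-minValue
|  search diff_A>$min$ AND diff_A<=$max$
|  eval score_A=$score$"
```

Keep in mind that map runs the inner search once per lookup row, so keep that inner search as cheap as possible.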

PS:

1) The query inside | map search=" with | makeresults and | append is used to generate dummy data for testing. Please replace it with your actual search query; the following just generates mock data:

 |  makeresults
 |  eval A=1500, B=2550, C=6150, D=999
 |  append [|  makeresults
 |  eval A=500, B=250, C=150, D=600]
 |  append [|  makeresults
 |  eval A=200, B=150, C=40, D=500]
 |  append [|  makeresults
 |  eval A=-250, B=0, C=-100, D=-15]
 |  append [|  makeresults
 |  eval A=123, B=0, C=240, D=315]

2) Looking at your actual search query, I think the join is not required; try using base search filters instead for better query performance.
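
For instance, the join in your original search could likely be replaced by a single base search, a sketch, assuming Application_Name and the A/B/C/D fields live in the same events (the thresholds are shown hard-coded here only for illustration; the lookup/map approach above is what replaces them):

```
index="myIndex" sourcetype=mySource Application_Name="BACKUP EXEC"
|  eval minValue=min(A, B, C, D)
|  eval diff_A=A-minValue
|  eval score_A=case(diff_A=0, "3",
                     diff_A<=500, "2",
                     diff_A<=1000, "1",
                     true(), "0")
|  table HostName, score_A
```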

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"