
How to exclude multiple time ranges from multiple searches by defining the time ranges only once?

kaurinko
Communicator

I have a number of services monitored by Splunk. Since maintenance breaks should be excluded from the performance calculations, I found a solution of this type in the community:

index=my_data_index
| eval MaintStart = strptime("2018/04/25 21:00:00","%Y/%m/%d %H:%M:%S")
| eval MaintEnd   = strptime("2018/04/25 22:00:00","%Y/%m/%d %H:%M:%S")
| where NOT ( _time > MaintStart AND _time < MaintEnd )
| eval MaintStart = strptime("2018/04/27 22:00:00","%Y/%m/%d %H:%M:%S")
| eval MaintEnd   = strptime("2018/04/28 02:00:00","%Y/%m/%d %H:%M:%S")
| where NOT ( _time > MaintStart AND _time < MaintEnd )
| chart count(eval(status="OK")) AS OK, count(eval(status="FAIL")) AS FAIL by _time span=1w
| eval "Availability %"=round(100.000*OK/(OK+FAIL),4)

In the example above I have two time ranges excluded from the search, and over time there will be more. As I have several performance-monitoring searches, it would be very helpful to define the exclusions only once and apply them wherever appropriate. Any ideas on how that could be done?

1 Solution

Ayn
Legend

One suggestion is to put this in a lookup file and then do some subsearch magic.

For instance if you have the following lookup file, let's call it maintenancetimes.csv:

MaintStart,MaintEnd
2018/04/25 21:00:00,2018/04/25 22:00:00
2018/04/27 22:00:00,2018/04/28 02:00:00

Then call this in a subsearch, and use return to create a search query that you can pass as raw input to the outer search:

index=my_data_index NOT [| inputlookup maintenancetimes.csv | convert timeformat="%Y/%m/%d %H:%M:%S" mktime(MaintStart) mktime(MaintEnd) | eval search="_time>".MaintStart." AND _time<".MaintEnd | return 500 $search]

After the subsearch has been evaluated, the whole base search will look like this:

index=my_data_index NOT ((_time>1524682800 AND _time<1524686400) OR (_time>1524859200 AND _time<1524873600))

which hopefully is what you want 🙂

The only gotcha here is that return gives back only one row by default, so you have to specify a maximum number of results that covers all your maintenance windows. Just put something sensibly large in there. 🙂
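
New maintenance windows then only need to be added to that one lookup file. The CSV can be uploaded under Settings > Lookups > Lookup table files, or created and extended directly from SPL with outputlookup; a minimal sketch (the timestamps here are made up for illustration):

| makeresults
| eval MaintStart="2018/05/02 01:00:00", MaintEnd="2018/05/02 03:00:00"
| table MaintStart MaintEnd
| outputlookup append=true maintenancetimes.csv

Every search that filters against the lookup picks up the new window on its next run.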

kaurinko
Communicator

I had a hard time getting this to work, but eventually I managed. There was one thing I had to change from the solution you proposed: I had to enclose the whole subsearch in square brackets [] inside round brackets (), and then it started doing what I wanted.

index=my_data_index "Something to match"
NOT ([| inputlookup maintenancetimes.csv
     | convert timeformat="%Y/%m/%d %H:%M:%S" mktime(MaintStart) mktime(MaintEnd)
     | eval search="_time>".MaintStart." AND _time<".MaintEnd
     | return 500 $search])

That does what I want. I do not know Splunk well enough to say why the extra round brackets were necessary.
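
To apply the same exclusion to the other performance searches without copy-pasting the subsearch each time, one option is to wrap the working clause in a search macro. A rough, untested sketch (the macro name exclude_maintenance is made up), defined in macros.conf or via Settings > Advanced search > Search macros:

[exclude_maintenance]
definition = NOT ([| inputlookup maintenancetimes.csv | convert timeformat="%Y/%m/%d %H:%M:%S" mktime(MaintStart) mktime(MaintEnd) | eval search="_time>".MaintStart." AND _time<".MaintEnd | return 500 $search])

Each monitoring search could then start with something like:

index=my_data_index "Something to match" `exclude_maintenance`
| chart count(eval(status="OK")) AS OK, count(eval(status="FAIL")) AS FAIL by _time span=1w
| eval "Availability %"=round(100.000*OK/(OK+FAIL),4)

Since macros use $...$ for their own arguments, it is worth verifying that the dollar sign used by return survives inside the macro definition on your Splunk version.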


kaurinko
Communicator

This looks very promising. I am not familiar with inputlookup, so I briefly looked at the manual page. The hint about using table looks tempting, but in any case there is clearly a solution to the problem.
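
For anyone else unfamiliar with inputlookup, the contents of the lookup can be inspected on their own with something like:

| inputlookup maintenancetimes.csv
| table MaintStart MaintEnd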

Thanks a lot!
