Alerting

multiple conditions spl help

bestSplunker
Contributor

Hello everyone!
I have a program that counts the number of requests to a website API per minute. The log format is as follows, where the time field is the timestamp of the statistics:

    request_domain       time                 uri            min_count
    www.test.com         11/Jul/2019 15:51    /api/test      19
    www.test.com         11/Jul/2019 15:51    /api/exmple    208
    m.test.com           11/Jul/2019 15:52    /api/search    80
    www.test.com         11/Jul/2019 15:52    /api/test      31
    www.test.com         11/Jul/2019 15:52    /api/exmple    253
    m.test.com           11/Jul/2019 15:52    /api/search    62

I want to create an alert based on the following requirements, but I don't know how to do it.

If the number of uri requests is greater than 100 per hour:

  1. Compared to the previous hour, if the growth rate is greater than 80%, then alert.
  2. Compared to the same time period yesterday, if the growth rate is greater than 80%, then alert.

If the number of uri requests is less than 100 per hour:

  1. Compared to the previous hour, if the growth in the number of requests is greater than 50, then alert.
  2. Compared to the same time period yesterday, if the growth in the number of requests is greater than 50, then alert.
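
For concreteness, taking growth rate as 100 * (current - previous) / previous, the two rules reduce to a check along the lines of the sketch below, where this_hour, prev_hour, and same_hour_yesterday are placeholder hourly counts per uri, and either comparison triggering is assumed to be enough to alert:

| eval growth_rate_prev = 100 * (this_hour - prev_hour) / prev_hour
| eval growth_rate_yday = 100 * (this_hour - same_hour_yesterday) / same_hour_yesterday
| where (this_hour > 100 AND (growth_rate_prev > 80 OR growth_rate_yday > 80))
    OR (this_hour < 100 AND ((this_hour - prev_hour) > 50 OR (this_hour - same_hour_yesterday) > 50))

For example, 190 requests this hour against 100 in the previous hour is 90% growth (alert under the first rule), while 90 against 30 is a growth of 60 requests (alert under the second rule).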

I think this needs to be split into at least two alerts.

all help will be greatly appreciated!

1 Solution

woodcock
Esteemed Legend

First of all, this is a prime candidate for an hourly summary index rollup so consider that (it makes the SPL easy). Short of that, try this:

index=<You should always specify an index> AND sourcetype=<And sourcetype too>
((earliest=-1d@h-1h latest=-1d@h) OR (earliest=-2h@h latest=now))
| bin span=1h _time
| stats count BY uri _time
| streamstats count AS _time BY uri
| eval _time = case(_time==1, "last_hour", _time==2, "prev_hour", _time==3, "yesterday_last_hour", true(), "ERROR")
| xyseries uri _time count
| eval growth_from_last_hour = (100 * (last_hour -           prev_hour) /           prev_hour)
| eval growth_from_yesterday = (100 * (last_hour - yesterday_last_hour) / yesterday_last_hour)
| where (((last_hour > 100) AND ((growth_from_last_hour > 80) OR (growth_from_yesterday > 80))) OR ((last_hour < 100) AND (growth_from_last_hour > 80)))
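
For reference, the hourly summary-index rollup mentioned above might look roughly like the scheduled search below, where summary_api_counts is a hypothetical summary index name and min_count is summed because the raw events already carry a per-minute count:

index=<your index> sourcetype=<your sourcetype> earliest=-1h@h latest=@h
| bin span=1h _time
| stats sum(min_count) AS hourly_count BY request_domain uri _time
| collect index=summary_api_counts

Run a few minutes past each hour, this lets the alert searches read the pre-aggregated hourly_count values from summary_api_counts instead of re-scanning the raw events.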

jkat54
SplunkTrust

If you want to do this in one search, here is how.

1st, eval the boundaries of the time period that is 24 hours ago (from -1h@h to @h, yesterday):

| eval 25hrAgo=relative_time(now(),"-25h@h")
| eval 24hrAgo=relative_time(now(),"-24h@h")

2nd, get the counts of uri requests per hour in our time frames:

|  stats count(eval(_time>=24hrAgo AND _time<=25hrAgo)) as yesterdayCount  count(eval(_time>=relative_time(now(),"-1h@h")) AND _time<=relative_time(now(),"-0h@h")) as lastHourCount  count(eval(_time>=relative_time(now(),"-2h@h")) AND _time<=relative_time(now(),"-1h@h")) as 2hrAgoCount 

3rd, do the math to determine whether an alert is needed:

| eval 80percA=if((2hrAgoCount/(lastHourCount-2hrAgoCount)*100)>80,1,0)
| eval 80percB=if((yesterdayCount/(lastHourCount-yesterdayCount)*100)>80,1,0)
| eval 50percA=if((2hrAgoCount/(lastHourCount-2hrAgoCount)*100)>50,1,0)
| eval 50percB=if((yesterdayCount/(lastHourCount-yesterdayCount)*100)>50,1,0)

Now, send an email if any of the above = 1:

| where 80percA=1 OR 80percB=1 OR 50percA=1 OR 50percB=1
| map search="|sendemail subject=\"$80percA$ $80percB$ $50percA$ $50percB$\" to=youremail@mail.com"

This should send you an email with a subject of, for example, 1 0 0 0, which would mean there were more than 100 requests in the last hour and that was 80% more than the same hour two hours ago.

bestSplunker
Contributor

@jkat54 thank you for your reply.

When I tried the first step of the search, Splunk prompted an error:

 Error in 'stats' command: The eval expression for dynamic field 'eval(_time>=24hrAgo AND _time<=25hrAgo)' is invalid. Error='The operator at 'hrAgo AND _time<=25hrAgo' is invalid.'

This error may be because you are using numbers at the beginning of the field names.

In addition, my program counts the number of uri requests per minute. Maybe I need to add | bin _time span=1h to the search statement you provided?
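
Since each event already carries a per-minute min_count, it is probably the sum of that field per window that matters rather than the number of raw events; with jkat54's single-stats approach that would mean a conditional sum instead of a conditional count, for example (a sketch only, showing just the last-hour window):

| stats sum(eval(if(_time>=relative_time(now(),"-1h@h") AND _time<relative_time(now(),"@h"), min_count, 0))) AS lastHourCount

A separate | bin _time span=1h would only be needed to get one result row per hour (as in woodcock's search) rather than a single row with one field per window.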

jkat54
SplunkTrust

I guess the stats doesn't work with the AND in the eval... I wrote this on my phone without testing.

bestSplunker
Contributor

@jkat54 How do I fix this error? _time>=24hrAgo AND _time<=25hrAgo

The field _time has minute precision (%Y/%m/%d %H:%M:%S), but 24hrAgo and 25hrAgo have an hourly time format (%Y/%m/%d %H).
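
For what it's worth, relative_time() returns epoch seconds and _time is stored as epoch seconds too, so the comparison itself does not depend on the display format; the parse error is, as guessed above, most likely the leading digits in the field names, which have to be single-quoted when referenced inside an eval expression (or renamed to something like hrAgo24). A hedged rewrite of the first clause along those lines:

| eval 25hrAgo=relative_time(now(),"-25h@h")
| eval 24hrAgo=relative_time(now(),"-24h@h")
| stats count(eval(_time>='25hrAgo' AND _time<'24hrAgo')) AS yesterdayCount

Note that the window runs from 25hrAgo (older) up to 24hrAgo (newer), and that the other two count(eval(...)) clauses in the original search would also need their AND ... parts moved inside the eval(...) parentheses.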
