
How to create an alert that checks the number of calls per second in a 5-minute window?

mdr003
Explorer

Hi team -

We currently use Elastic for log storage and alerting, but we are in the process of converting to Splunk. We have some Elastic alerting that runs every five minutes and looks for the number of calls to a specific Apigee service. It works out how many calls were made in each 1-second interval and alerts if the traffic in one or more intervals is above a threshold.

Is it possible to do the same in Splunk? Run a query on hits in the last 5 minutes, bucket them into 1-second intervals to get a count for each, and work out the highest count value?


mdr003
Explorer

@richgalloway @efavreau

Thank you both for the suggestions - I will look into using bin or timechart; both seem like better options than my solution.


richgalloway
SplunkTrust

That's pretty easy to do in Splunk. It looks something like this. Change "10" to your threshold value. Change "apigee" to the real index where the data resides.

index=apigee
| timechart span=1s count
| where count > 10

This search will produce no results if no interval is above the threshold. Set the alert to trigger if the number of results is not zero.
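
If the alert schedule doesn't already constrain the time range, the window can be pinned in the search itself. As a sketch, using the 5-minute window from the question (the earliest/latest values are an assumption about your setup):

index=apigee earliest=-5m@s latest=@s
| timechart span=1s count
| where count > 10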

---
If this reply helps you, Karma would be appreciated.

efavreau
Motivator

@mdr003 That's possible. Depending on where you're getting the data from, with the correct index, source, sourcetype, host values, and search terms, you can get there. Here's a fictitious example that might help. It gets the data, breaks time into 1-second bins, counts events per host within each bin, sorts the results with the largest count first, then clips the output to the first 3. Season to taste.
Good luck!

index=_internal source=*scheduler.log search_type=scheduled
| bin _time span=1s
| stats count AS Count BY _time host
| sort - Count
| head 3
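
For alerting rather than inspection, a variant of the same search (the threshold of 10 here is just an assumed example value) filters each bin on its count instead of keeping the top 3, so the alert can trigger on any results:

index=_internal source=*scheduler.log search_type=scheduled
| bin _time span=1s
| stats count AS Count BY _time host
| where Count > 10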



###

If this reply helps you, an upvote would be appreciated.

mdr003
Explorer

Disregard - I was able to figure it out:

index="index" sourcetype="source"
| eval date=strftime(_time, "%M-%S")
| stats count by date
| sort -count
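
Note: the "%M-%S" key is unique within a 5-minute window, but it would collide across hours if the window were ever widened. A bin-based variant (a sketch, keeping the same placeholder index and sourcetype) avoids that and returns the single busiest second:

index="index" sourcetype="source" earliest=-5m@s latest=@s
| bin _time span=1s
| stats count by _time
| sort -count
| head 1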
