Alerting

How to check if the event count is greater than a threshold in a specific timeframe?

Nidd
Path Finder

Requirement:

I have a ton of events and I need to create an alert that keeps monitoring my job for the number of events it processed over the last hour. It should alert whenever the event count exceeds a specific threshold.

I have framed the query below, but it is not showing any results, even when there are events that should match.

 

index=myIndex "myJob" earliest=-1h latest=now | stats count as eventsCount by _time | where eventsCount > 5000

 

Where am I making a mistake? Please help.


The_Data_Pirate
Splunk Employee

Hey Nidd,

The problem with your search is that stats count by _time groups events by their exact timestamp, so each row typically contains only a handful of events and the count never reaches 5000. I have had a little play; please try the search below and see if it works for your use case. I've bucketed the count into 5-minute chunks with the span argument. Feel free to change this to whatever works for you.

 

index=myIndex "myJob" earliest=-1h latest=now
| timechart count span=5m
| eval outlier=if(count>5000,1,0)
| search outlier=1
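
If you would rather have a single check across the whole hour instead of 5-minute buckets, a minimal sketch (reusing the index and search term from your original query) is to drop the by _time clause so stats counts every event in the window:

index=myIndex "myJob" earliest=-1h latest=now
| stats count as eventsCount
| where eventsCount > 5000

Saved as an alert that runs every hour, this search only returns a row when the hourly count crosses the threshold, so you can set the alert to trigger when the number of results is greater than zero.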