Alerting

How to trigger an alert when the 403 status code occurs more than 100 times in a second

dhavamanis
Builder

Can you please tell me how to trigger an alert when the 403 status code occurs more than 100 times in a second, in real time (threshold)?


yannK
Splunk Employee

Realtime search is not the best fit for this kind of measurement, but it can work if you are ready to pay the performance price and accept some false positives.

  • you can filter the events with the 403 status code
  • _time in Splunk is already in seconds (epoch time), so you can count the number of events per second
  • add a condition for count > 100

<mywonderfullsearch> status_code=403 | stats count by _time | where count > 100 | convert ctime(_time) AS time

  • test the search
  • then pick a realtime window that is not too large, say the last 10 minutes
  • then schedule it
  • and add an alert trigger "number of results > 0"
  • set up the email for the alert
  • if needed, check "inline results" to include the details
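The steps above can also be sketched as a savedsearches.conf stanza; the stanza name, base search field, and email recipient are placeholders, not something from the original thread:

```
# Hypothetical stanza -- name, search, and recipient are assumptions
[http_403_flood]
search = status_code=403 | stats count by _time | where count > 100 | convert ctime(_time) AS time
enableSched = 1
# realtime window of the last 10 minutes
dispatch.earliest_time = rt-10m
dispatch.latest_time = rt
# trigger when the search returns any results
counttype = number of events
relation = greater than
quantity = 0
# email the inline results
action.email = 1
action.email.to = admin@example.com
action.email.inline = 1
```

See the savedsearches.conf documentation linked in the remarks below for the full list of alert settings.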

Remarks:

  • uncheck any alert retention: if you have too many alerts (say, one per second...), the preserved search results will fill your dispatch folder and impact your server.
    See alert.suppress and alert.suppress.period in http://docs.splunk.com/Documentation/Splunk/6.1.1/Admin/Savedsearchesconf

  • add details to the search results
    As an improvement, I would replace stats count by _time with stats count values(host) by _time
    to include the list of all the hosts concerned in the alert.

  • switch from realtime to a historical search
    To avoid false positives, I still recommend running the search as a historical search, for example every 5 minutes over earliest=-7m@m latest=-2m@m (the 2-minute delay accounts for possible indexing lag).
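Putting the remarks together, a historical version of the search could look like this (status_code and host are the field names used in the answer above; adjust them to your data):

```
status_code=403
| stats count values(host) AS hosts by _time
| where count > 100
| convert ctime(_time) AS time
```

Schedule it every 5 minutes over earliest=-7m@m latest=-2m@m, with the alert trigger "number of results > 0".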


dhavamanis
Builder

I have updated the search query below for this alert. Please correct me if anything is wrong.

sourcetype=acquiasyslog status=403 | stats count by _time, uri_path | where count > 10


dhavamanis
Builder

Thank you so much for the details. Can you please help me with this: we need to set up an alert for the search "sourcetype=acquiasyslog status=403 | stats count by uripath", to trigger when the count for any uripath exceeds 10 times in a second.
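A sketch of that alert as a historical search, keeping the per-second grouping from the accepted answer (sourcetype and the uripath field name are taken from the question above and are assumptions about your data):

```
sourcetype=acquiasyslog status=403
| stats count by _time, uripath
| where count > 10
```

Run it on a schedule, say every 5 minutes over earliest=-7m@m latest=-2m@m, and trigger on "number of results > 0"; each result row is one second/uripath pair that exceeded the threshold.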
