I am trying to set up a custom alert that triggers when I receive more than 50 emails from any given address within a 30-minute window. I am using this to alert on spam. So far I have the alert working, and once triggered it does not alert again for another 2 hours. The issue I have is that after 2 hours Splunk will alert me again, saying firstname.lastname@example.org has sent 200 emails. Since I already know I have received excess email from that sender, I would like Splunk to stop alerting on that specific address but keep alerting on anything new that comes up.
Is such a thing possible?
One thought I had is to run a scheduled search every 30 minutes that only looks at the previous 30 minutes, so it will pick up anything new. However, if the spam hits are spread over the course of an hour, for example, I will get a separate alert for each 30-minute window in which the count exceeds 50, which brings back the original problem.
Take this run-anywhere example and adapt it to your needs:
index=_internal source=*metrics.log earliest=-30min@min latest=@min
    NOT [ search index=_internal source=*metrics.log earliest=-60min@min latest=-30min@min | dedup series | table series ]
| stats count by series
This example searches for all series in metrics.log from 60 to 30 minutes ago (the subsearch) and uses that result to exclude those series from the same search over the last 30 minutes. The resulting stats therefore only show series that are new within the last 30 minutes.
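Adapted to your mail logs, the same pattern might look like the sketch below. This is an assumption-heavy example: I am guessing your events live in an index called `mail` and carry a sender field called `sender`, and that a 2-hour lookback matches your current throttle window; substitute your actual index, field names, and thresholds.

```
index=mail earliest=-30min@min latest=@min
    NOT [ search index=mail earliest=-2h@min latest=-30min@min
          | stats count by sender
          | where count > 50
          | table sender ]
| stats count by sender
| where count > 50
```

Scheduled every 30 minutes, this alerts only on senders that cross the 50-email threshold in the latest window while suppressing senders that already crossed it in the preceding 2 hours, so a spam run spread over several windows fires once rather than repeatedly.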