Splunk Search

What is a good query to monitor for someone sending too many alerts?

Communicator

I received an email from ES techs that someone had sent over 128k alerts to the same address in a 24-hour period.
I tracked it down to two private alerts and disabled them.
Researching further, those emailed alerts were only the ones that were successfully sent; many other people did not get their alerts or scheduled reports that day.

Here is a sample of the error messages from the _internal index:
04-09-2019 09:15:07.768 -0500 ERROR ScriptRunner - stderr from '/opt/splunk/bin/python /opt/splunk/etc/apps/search/bin/sendemail.py "results
link=http://TheSearchHeadServer:8000/app/ALL_my_sales/@go?sid=scheduler__karlA28BCL_ZGlnaXRhbF9zYWxlcw__R..." "ssname=Null Pointer" "graceful=True" "triggertime=1554819306" resultsfile="/opt/splunk/var/run/splunk/dispatch/schedulerkarlA28BCL_ZGlnaXRhbF9zYWxlcwRMDL559a15d8ba081a9e5at1554819300_52888/results.csv.gz"': ERROR:root:(452, '4.3.1 Insufficient system storage', u'SplunkSH@gmail.com') while sending mail to: karlA28BCL@gmail.com
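To catch a runaway alert before the mail server starts rejecting, one option is to count outbound alert emails per recipient rather than waiting for errors. A rough sketch is below; the sourcetype, the "Sending email" message text, the rex pattern, and the 1000 threshold are all assumptions to adjust against what your _internal events actually contain:

index=_internal sourcetype=splunk_python sendemail "Sending email"
| rex "recipients=\[(?<recipient>[^\]]+)\]"
| bin _time span=24h
| stats count by _time, recipient
| where count > 1000

Run it on a schedule (for example hourly over the last 24 hours) and alert when any recipient crosses the threshold.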


Re: What is a good query to monitor for someone sending too many alerts?

Communicator

Here is the query I have so far:
host=SplunkSH index=_internal "-0500 ERROR ScriptRunner - stderr from '/opt/splunk/bin/python /opt/splunk/etc/apps/search/bin/sendemail.py" "Insufficient system storage'" "while sending mail to:"
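Building on that, the matched errors can be grouped by recipient so the noisiest destination surfaces first. This sketch assumes the "while sending mail to:" format shown in the error above; the rex pattern and field name are my own additions:

host=SplunkSH index=_internal "ERROR ScriptRunner" sendemail.py "while sending mail to:"
| rex "while sending mail to:\s+(?<recipient>\S+)"
| stats count AS failures BY recipient
| sort - failures

Note that hard-coding "-0500" in the search string will break when the timezone offset changes (for example at daylight saving transitions), so it is safer to leave it out, as done here.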
