We have come across a strange situation where email notifications for only some alerts are not being sent to the recipients.
The alert is configured as follows:
Condition: number of events > 0
Alert mode: once per searchhead
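For reference, that configuration corresponds roughly to the following savedsearches.conf attributes (a sketch only — the stanza name matches the alert in the logs below, and the recipient value is illustrative):

```ini
[Violet - DNS Killswitch Domain]
# Trigger condition: number of events > 0
counttype = number of events
relation = greater than
quantity = 0
# "Once" alert mode: one digest email per triggering, not one per result
digest_mode = 1
action.email = 1
action.email.to = first.recipient@xxx.com
```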
However, when checking the logs, python.log and scheduler.log both show the email action succeeding:
2017-05-17 13:30:14,023 +0200 INFO sendemail:128 - Sending email. subject="Splunk Alert: Violet - DNS Killswitch Domain", results_link="https://splunk-web.irmip.aws.xxx-cloud.com:443/app/xxx-irm-ip-app/@go?sid=scheduler_Y2hyaXN0b3BoZXIuYmVhcmRAc2hlbGwuY29t_c2hlbGwtaXJtLWlwLWFwcA__RMD5cd4d617bba44680d_at_1495020600_20380_DF46786F-27DA-4875-91AD-05FD59BB8A5F", recipients="[u'Christopher.Beard@xxx.com', u'Robert.Mora@xxx.com', u'Andreas.Sfakianakis@xxx.com', u'Ditmar.DenEngelsen@xxx.com', u'DeMar.Joseph@xxx.com']", server="email-smtp.eu-west-1.xxx.com:25" host = ip-10-0-160-218.eu-west-1.compute.internal source = /opt/splunk/var/log/splunk/python.log sourcetype = splunk_python
05-17-2017 12:30:14.055 +0100 INFO SavedSplunker - savedsearch_id="nobody;xxx-irm-ip-app;Violet - DNS Killswitch Domain", search_type="", user="email@example.com", app="xxx-irm-ip-app", savedsearch_name="Violet - DNS Killswitch Domain", priority=default, status=success, digest_mode=1, scheduled_time=1495020600, window_time=0, dispatch_time=1495020610, run_time=2.028, result_count=3, alert_actions="email", sid="scheduler_Y2hyaXN0b3BoZXIuYmVhcmRAc2hlbGwuY29t_c2hlbGwtaXJtLWlwLWFwcA__RMD5cd4d617bba44680d_at_1495020600_20380_DF46786F-27DA-4875-91AD-05FD59BB8A5F", suppressed=0, thread_id="AlertNotifierWorker-1" host = ip-10-0-160-218.eu-west-1.compute.internal source = /opt/splunk/var/log/splunk/scheduler.log sourcetype = scheduler
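The scheduler line above already shows status=success and suppressed=0, so Splunk believes the email action fired. If you need to sweep many such lines for suppressed or failed runs, a minimal sketch (the parse_kv helper is our own, not a Splunk API) that pulls out the key=value fields:

```python
import re

def parse_kv(line):
    """Parse key=value pairs from a Splunk log line; values may be double-quoted."""
    fields = {}
    for m in re.finditer(r'(\w+)=("([^"]*)"|[^",\s]+)', line):
        key = m.group(1)
        # group 3 is the unquoted inner value when the value was quoted
        fields[key] = m.group(3) if m.group(3) is not None else m.group(2)
    return fields

# Example against a scheduler.log entry like the one above:
entry = 'status=success, digest_mode=1, result_count=3, alert_actions="email", suppressed=0'
fields = parse_kv(entry)
print(fields["suppressed"], fields["alert_actions"])  # "0" means the alert was not throttled
```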
Can somebody help me troubleshoot this? Am I missing something?
It seems like you are in a search head cluster and perhaps the
Did your failed alerts have attachments? If an attachment is larger than your email server's size limit, the message will be blocked by the server.
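One quick way to check that theory: many SMTP servers advertise their maximum message size in the EHLO response via the SIZE extension (RFC 1870). A minimal sketch, assuming Python 3 is available on the search head — the server name is taken from the python.log line above, and advertised_size_limit / exceeds_limit are our own helper names:

```python
import smtplib

def advertised_size_limit(host, port=25, timeout=10):
    """EHLO the server and return its advertised SIZE limit in bytes,
    or None if it does not advertise one."""
    with smtplib.SMTP(host, port, timeout=timeout) as smtp:
        smtp.ehlo()
        if smtp.has_extn("size"):
            value = smtp.esmtp_features.get("size", "")
            return int(value) if value.isdigit() else None
        return None

def exceeds_limit(attachment_bytes, size_limit):
    """True only when a known SIZE limit exists and the attachment is larger."""
    return size_limit is not None and attachment_bytes > size_limit

# Example (uncomment to run against your relay):
# limit = advertised_size_limit("email-smtp.eu-west-1.xxx.com", 25)
# print(limit, exceeds_limit(20_000_000, limit))
```

If the failing alerts' CSV/PDF attachments are bigger than that limit while the succeeding alerts' attachments are smaller, you have your answer; check the relay's own logs for a 552 "message size exceeds" rejection.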