Alerts no longer send email, but they do show up in triggered alerts. This started sometime yesterday; before that, we had been receiving and creating email alerts for months.
var\log\splunk\python.log shows:
2018-08-16 15:44:41,450 PDT ERROR sendemail:115 - Sending email. subject="testalert010", results_link="http://ABCSPLNK02:8000/app/abc_ist/search?q=%7Cloadjob%20rt_scheduler__username_ZHd0X2lzdA__testtest_at_1534459452_73.0%20%7C%20head%201%20%7C%20tail%201&earliest=0&latest=now", recipients="[u'myemailaddress@abc.com']", server="localhost"
2018-08-16 15:44:41,450 PDT ERROR sendemail:378 - [Errno 10061] No connection could be made because the target machine actively refused it while sending mail to: u'myemailaddress@abc.com
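Errno 10061 (connection actively refused) means nothing was listening on the SMTP host/port that sendemail tried to reach — per the log, server="localhost", which by default means port 25 on the Splunk server itself. As a quick sanity check outside of Splunk, something like this can confirm whether a relay is actually listening there (host and port are assumptions based on the log's server="localhost" and the SMTP default):

```python
import socket

def check_smtp(host="localhost", port=25, timeout=5):
    """Attempt a raw TCP connection to the SMTP endpoint Splunk would use.

    Returns "connected" if something is listening, "refused" on a
    connection-refused error (the same condition as sendemail's
    Errno 10061 on Windows), or the exception class name otherwise.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "connected"
    except ConnectionRefusedError:
        return "refused"
    except OSError as exc:
        return type(exc).__name__

if __name__ == "__main__":
    print(check_smtp())
```

If this prints "refused" when run on the Splunk server, the problem is that no relay is listening on localhost:25 at all, so the question becomes whether Splunk was previously configured to use a remote mail server and something reset it back to localhost.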
We've checked, and there is no firewall/Exchange/Mimecast/SMTP rule blocking the emails. I had been editing limits.conf, savedsearches.conf, and alert_actions.conf for completely separate reasons, and those changes should not have affected email alerting. To be safe, I went back and deleted every one of those .conf files that had been edited in the last three days (in the local directories, so the default files have taken back over), and still no luck.
I also installed the Slack app in the last three days. Is it possible that the installation overrode SMTP settings in Splunk or something? (I have since deleted the Slack app in case that was the cause, but removing it did not fix the problem.)
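One way to rule out a config override from an app install is to check which alert_actions.conf file is actually supplying the mailserver setting — `splunk btool alert_actions list --debug` is the usual tool for that. As an illustration of the layering being checked, here is a rough sketch of last-one-wins precedence using configparser; the file paths and app name below are hypothetical, not the real Splunk resolution order:

```python
import configparser
import os

# Hypothetical precedence list: later paths win, loosely mirroring Splunk's
# local-over-default layering. Real files live under
# $SPLUNK_HOME/etc/{system,apps/<app>}/{default,local}/alert_actions.conf.
CONF_PATHS = [
    r"etc\system\default\alert_actions.conf",
    r"etc\apps\slack_alerts\default\alert_actions.conf",  # hypothetical app
    r"etc\system\local\alert_actions.conf",
]

def effective_mailserver(paths=CONF_PATHS):
    """Return (value, source_path) for the winning [email] mailserver setting."""
    value, source = None, None
    for path in paths:
        if not os.path.exists(path):
            continue
        parser = configparser.ConfigParser(strict=False)
        parser.read(path)
        if parser.has_option("email", "mailserver"):
            value, source = parser.get("email", "mailserver"), path
    return value, source
```

If btool (or a check like this) shows mailserver resolving to localhost when it should be your Exchange relay, that would explain the refused connection regardless of which app last touched the file.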
I have looked at all the Splunk Answers posts on this issue, to no avail.