Hi,
I'm having an issue with my Splunk server. I'm trying to set up some alerts and have entered all my email relay data under the email settings. As a test, I created an alert that contains the following search:
index=_internal | head 1
The alert is triggered, but I don't receive an email. I have checked the mail relay, and it seems like the message is going through, but I still don't receive anything. As another test, I ran the following search:
index=_internal | head 1 | sendemail to="<my email address>" format=raw sendresults=1 server=<smtp relay> footer="Sent from Splunk." from="SplunkAlerts" subject="Splunk Alert" message="The following Splunk Alert has been fired:"
When I run this search, I receive the email. Is there something I'm missing in my alert configuration? Any help you can provide would be greatly appreciated.
I found the issue. It wasn't with Splunk or the mail relay. The external Exchange server that we need to use (provided by our parent company) was marking the message as spam. I still can't figure out why the manual search didn't get marked as spam but the alert did; however, it's working now. Thanks for all your help.
Hi @jbouch03,
How did you find this issue?
How did you identify that the external server was marking the email as spam? Is there a way we can search for all the spam-marked emails in Splunk?
Check $SPLUNK_HOME/var/log/splunk/python.log, which is where all the sendemail errors are written.
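Since python.log is indexed into _internal, you can also search it from Splunk itself instead of reading the file on disk. A minimal sketch of such a search (the exact keywords worth filtering on may vary by version):

index=_internal source=*python.log* sendemail (ERROR OR WARNING)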
Thank you for this!! I was having the exact same issue and couldn't figure out why until I read this thread. Apparently the Splunk SH we were using wasn't set up to send mail in general. We tried another search head and it worked perfectly. This helped A LOT!
Have you configured the mail servers on the Splunk side?
Check the alerts.conf file.
The alert_actions.conf file is configured. Is there a separate .conf file that needs to be configured?
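For reference, the email settings entered in Splunk Web land in the [email] stanza of alert_actions.conf. A minimal sketch of what that stanza might look like (the hostname and values here are placeholders, not your actual settings):

[email]
mailserver = smtp.example.com:25
from = SplunkAlerts
use_tls = 0
use_ssl = 0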
On Splunk 7.3.1 there is no such thing as an alerts.conf file.
Anything suspicious in index=_internal source=*python.log?
As far as I can tell, everything looks correct. I see the INFO statements, but I don't see any ERROR or WARN entries.