Alerting

How to troubleshoot Splunk email notification

splunktp
Explorer

Hi All,

We have Splunk 4.1.4 set up to send email alert notifications; however, we are not receiving any. What log files can we check to troubleshoot?

From the Splunk indexer, we can ping and telnet to the configured mail host, but we are still not receiving email alerts.

Thanks Everyone

1 Solution

jbsplunk
Splunk Employee

The logs you are interested in seeing will be in $SPLUNK_HOME/var/log/splunk/

Be sure to double-check that your saved search is configured to trigger. Often people don't realize it's set to fire only if the number of results rises by 1 or 10, and the last run of the search returned fewer new results than required to trigger.

If that looks OK, the next thing to check is scheduler.log. Look for the name of your saved search; you should be able to tell from there whether any action was triggered. Assuming it was, email alerts are sent out by Python, so check python.log to see what is happening. If there was an error sending to the mail server, you'd see the problem displayed there.

Also see: http://www.splunk.com/wiki/Community:TroubleshootingAlertScripts
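The two log checks above can be scripted. A minimal sketch in Python; the scheduler.log lines below are made up for illustration, so adapt the matching to what your version actually logs:

```python
# Sketch: filter scheduler.log-style lines for a given saved search name
# and report whether an email alert action appears to have fired.
# The sample log text is illustrative, not real Splunk output.

def lines_mentioning(log_text, search_name):
    """Return the log lines that mention the saved search."""
    return [ln for ln in log_text.splitlines() if search_name in ln]

sample_scheduler_log = """\
10-17-2011 23:00:05 INFO SavedSplunker - savedsearch_id="nobody;search;My Splunk Report", status=success, alert_actions="email"
10-17-2011 23:05:05 INFO SavedSplunker - savedsearch_id="nobody;search;Other Search", status=success, alert_actions=""
"""

hits = lines_mentioning(sample_scheduler_log, "My Splunk Report")
fired = any('alert_actions="email"' in ln for ln in hits)
print(f"matching lines: {len(hits)}, email action fired: {fired}")
```

In practice you would read `$SPLUNK_HOME/var/log/splunk/scheduler.log` (and then python.log) instead of the sample string.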


Jarohnimo
Builder

I have this exact same problem, and it doesn't look like there was a real solution to it: "Connection unexpectedly closed".

I get the same message. I have SMTP working on my original Splunk box, but it won't work on my new one. The variable is that the first Splunk server is bound to a certificate over https://, while the new box doesn't use SSL.

Do you think the SSL connection may have something to do with the SMTP server rejecting the request? I understand this isn't directly a Splunk issue, but others have hit this message. We need help.


jbsplunk
Splunk Employee

Splunk sends the already composed email to the mail server, and whatever error comes back is what we've received from the network stack. Splunk doesn't tell you why the connection was closed, because it doesn't know why the connection was closed. You'll need to do some footwork on your side to figure that out. A good place to start would be a tcpdump. It probably has something to do with either a network or mail server misconfiguration.
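As a lighter-weight first step before tcpdump, you can reproduce the SMTP conversation by hand from the Splunk host; Python's smtplib can echo the whole dialogue, which usually shows exactly where the connection drops. A sketch, where `mail.example.com` is a placeholder for your configured mail host:

```python
# Sketch: talk to the mail server directly so the failure point is
# visible. The host name is a placeholder; substitute your mail host.
import smtplib

def probe_smtp(host, port=25, timeout=10):
    """Attempt the SMTP handshake; return (ok, detail)."""
    try:
        conn = smtplib.SMTP(host, port, timeout=timeout)
        conn.set_debuglevel(1)  # echo the full SMTP dialogue to stderr
        code, _banner = conn.noop()
        conn.quit()
        return True, f"server answered NOOP with code {code}"
    except (smtplib.SMTPException, OSError) as err:
        return False, f"{type(err).__name__}: {err}"

if __name__ == "__main__":
    print(probe_smtp("mail.example.com"))
```

If this fails the same way Splunk does, the problem is between the host and the mail server, not in Splunk itself.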


splunktp
Explorer

Thanks for the info. My python.log shows the same result for each search that I have configured to run:

[root@splunk splunk]# tail -f python.log
2011-10-17 23:00:29,209 ERROR Sending email. subject="Splunk Alert: My Splunk Report", results_link="https:///app/search/@go?sid=scheduler_nobody_search_QWNjZXNzIC0gU2VydmljZSBBY2NvdW50IEludGVyYWN0aXZlIExvZ2lucw_at_1318872600_1835711220", recepients="['splunkusers@mycompany.com']"

2011-10-17 23:00:29,210 ERROR Connection unexpectedly closed while sending mail to: splunkusers@mycompany.com

However, it does not show why the connection was unexpectedly closed.
I had changed my Mail host to be the same as the hostname of my Splunk server (CentOS 5). What do you think is wrong here? Thanks
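Since the Mail host was changed to the Splunk server itself, it's worth confirming an MTA is actually listening there at all; on CentOS the default sendmail configuration often binds only to 127.0.0.1. A minimal sketch of that check (host and port are placeholders for your settings):

```python
# Sketch: verify something is listening on the configured mail
# host/port before blaming Splunk. Host/port are placeholders.
import socket

def port_open(host, port=25, timeout=5):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print(port_open("localhost", 25))
```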
