How do I troubleshoot why Splunk is not sending alert emails?


Hi everyone.

I've been searching through other answers, but none of them has solved my problem.

I have some alerts that, depending on the conditions, send an email with details of the incident. For the past few days I haven't been receiving any emails from Splunk.

I forced the alert condition and it did not send any email. The alert also does not appear under Triggered Alerts. When I run the search manually, the results show up.

I have already checked the following settings:

  1. alert_actions.conf
  2. a query over the scheduler log: index=_internal source=*scheduler.log (see the sketch below)
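
For example, to confirm the scheduler is actually running the alert, a sketch like this helps ("My Alert Name" is a placeholder for your saved search's name):

index=_internal source=*scheduler.log savedsearch_name="My Alert Name" | table _time savedsearch_name status result_count

If the alert shows status=skipped, or never appears at all, the problem is in the scheduling rather than in the email action itself.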

I forced an email to be sent with this search:

index=_internal | head 1 | sendemail to="" format="html" server=587 use_tls=1

and it sends the email.
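
In case it matters, I believe the effective [email] settings can also be dumped with btool (path assumes a standard install), which shows which .conf file each setting comes from:

$SPLUNK_HOME/bin/splunk btool alert_actions list email --debug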

Does anyone have any other tips for investigating this?

Thanks so much.



Hi there,

I've spent two days trying to configure Splunk alert emails.

I'm not sure what is in your alert_actions.conf, but by default the sender is set to "splunk_sender" or something like that, so in order to send an email you have to specify the "from" property. You can also change it in Settings > Searches, reports, and alerts: find your search or alert (if you want to use the email alert action), click Edit > Advanced edit, and then change the value of the "from" setting (action.email.from).
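
For illustration, a minimal sketch of the relevant settings (the stanza name and addresses are placeholders, not your real values). Globally, in alert_actions.conf:

[email]
from = splunk-alerts@example.com
mailserver = smtp.example.com:587
use_tls = 1

Or per alert, in savedsearches.conf:

[My Alert Name]
action.email = 1
action.email.to = you@example.com
action.email.from = splunk-alerts@example.com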

Also, sending email through Gmail's SMTP server is a bit tricky, as you probably have two-factor authentication enabled. I would recommend switching to a different server.
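
If you do stay on Gmail, the authentication settings live in the same [email] stanza; a sketch (the username and password here are placeholders, and with two-factor auth you'd normally need an app password instead of your account password):

[email]
mailserver = smtp.gmail.com:587
use_tls = 1
auth_username = you@gmail.com
auth_password = <your app password>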



Look at your _internal logs: splunkd.log, python.log, and the rest of $SPLUNK_HOME/var/log/splunk. There will be something there.

index=_internal sendemail
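
To narrow that down to failures, a variation like this may help (assuming the default _internal logs; sendemail errors such as a bad SMTP host, auth failures, or TLS problems usually land in python.log):

index=_internal (source=*python.log* OR source=*splunkd.log*) sendemail (ERROR OR WARN)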


Splunk Employee

What app is your alert running in? Can you check the alerts page, open the alert in search, and confirm that you see the events you expect?

Can you check for your search name in this search: index=_internal source=*splunkd.log alert
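
If your alert uses a modular alert action (like the Slack one below), its execution is logged by the sendmodalert component, so a more targeted variant would be ("My Alert Name" is a placeholder):

index=_internal source=*splunkd.log component=sendmodalert "My Alert Name"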

For example, here is a Slack alert action firing when my search finds logins in *nix logs:

[screenshot: splunkd.log events showing the Slack alert action firing]