Alert suddenly stops sending email as alert action

Path Finder

Hello there, I hope someone can help!

A report we generated doesn't send its alert emails, or only sporadically. Also, the action time for it is very high, as you can see here (field "action_time_ms"):

10-08-2021 14:20:35.156 +0200 INFO SavedSplunker - savedsearch_id="nobody;search;SIGNL4 - High or Critical Notable Events Clone", search_type="scheduled", user="maximilian.wehner", app="search", savedsearch_name="SIGNL4 - High or Critical Notable Events Clone", priority=default, status=success, digest_mode=0, scheduled_time=1633695120, window_time=0, dispatch_time=1633695122, run_time=2.838, result_count=1, alert_actions="email", sid="scheduler_bWF4aW1pbGlhbi53ZWhuZXI__search__RMD55d86aa6233cebf27_at_1633695120_428", suppressed=0, fired=1, skipped=0, action_time_ms=509817, thread_id="AlertNotifierWorker-1", message="", workload_pool=""


An action_time_ms of 509817 (over 8 minutes) is a LOT, so something is delaying or blocking the send.
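As a quick triage sketch (my own addition, not from the original post): these scheduler lines also land in Splunk's `_internal` index (e.g. `index=_internal sourcetype=scheduler`), and if you export them to a file you can scan for slow alert actions with something like the following. The regex and the 10-second threshold are illustrative choices, not Splunk-defined values:

```python
import re

# Pull savedsearch_name and action_time_ms out of a SavedSplunker log line
# so that unusually slow alert actions stand out.
LOG_PATTERN = re.compile(
    r'savedsearch_name="(?P<name>[^"]+)".*?action_time_ms=(?P<ms>\d+)'
)

def slow_actions(lines, threshold_ms=10_000):
    """Return (savedsearch_name, action_time_ms) for every line over the threshold."""
    hits = []
    for line in lines:
        m = LOG_PATTERN.search(line)
        if m and int(m.group("ms")) > threshold_ms:
            hits.append((m.group("name"), int(m.group("ms"))))
    return hits

sample = ('savedsearch_name="SIGNL4 - High or Critical Notable Events Clone", '
          'priority=default, status=success, action_time_ms=509817, fired=1')
print(slow_actions([sample]))
# [('SIGNL4 - High or Critical Notable Events Clone', 509817)]
```

Feeding it the whole scheduler.log would show whether only this one saved search is slow or whether every email action on the search head is affected.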

Usually Splunk can send email without a dedicated mail server being configured, but currently we want to use our O365 mail server for it. This has been tested in another environment, where it definitely works like a charm. Here is the config of the alert and the mail server:

--> We've artificially set maxtime very high to check whether Splunk eventually sends the mail after a while. The record was over 8 minutes until a mail was sent.
My questions now are: how can this happen, and is there a way to further investigate and resolve this issue? This alert is mandatory for a security view, and if it only comes through every now and then, that's a major problem.
auth_password = ****
auth_username =
from =
mailserver =
pdf.header_left = none
pdf.header_right = none
use_tls = 1
reportPaperSize = a4
hostname = somehostname
maxtime = 20m
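For comparison, a typical Office 365 stanza in alert_actions.conf looks roughly like the sketch below. The hostname and port are the standard O365 SMTP-submission values and the addresses are placeholders, not values taken from this thread; verify them for your own tenant:

```ini
[email]
# Standard O365 SMTP submission endpoint (STARTTLS on 587)
mailserver = smtp.office365.com:587
use_tls = 1
# Placeholder sender account; set the password via Splunk Web so it is stored encrypted
auth_username = alerts@example.com
from = alerts@example.com
# Keep maxtime at a sane value once the delay is understood
maxtime = 5m
```

If the working environment sends in ~500 ms with an equivalent stanza, the config itself is probably not the culprit and the network path becomes the prime suspect.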

Is there something wrong with the config? What can I do to further troubleshoot and hopefully resolve this issue? I guess this issue has come up in the past.


thanks a lot for help!



Is that the only alert having this issue? Could there be a setting throttling your results? 
Could you manually recreate that alert?


Path Finder



Yes, that's the only alert having that problem. The settings are the same as in another environment, where it doesn't take 30000 ms but 500. The main difference between the environments is that the search heads on which I use the alert sit in different networks, so I'm assuming something between the search head and my email inbox is throttling it. Could that be?
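One way to test that suspicion (my own sketch, not from the thread) is to time a plain TCP handshake from each search head to the configured mail server and compare the two environments. The hostname in the example is a placeholder for whatever is set in `mailserver`:

```python
import socket
import time

def time_tcp_connect(host, port, timeout=10.0):
    """Time a plain TCP handshake to the mail server.

    Returns elapsed seconds on success, or None if the connection
    fails or times out.
    """
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None

# Example (placeholder host; use your configured mailserver value):
# print(time_tcp_connect("smtp.office365.com", 587))
```

If the handshake itself takes seconds (or times out) from the slow search head but is instant from the working one, the delay sits in the network path or firewall rather than in Splunk.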

I don't really "get" this error, since it's the first time, and with an alert that has been used multiple times on different occasions.



Strange, and I'm assuming that the problem alert isn't the only alert on that SH in that environment?

Are you able to send the results of that alert to a different email address? That would answer the question of whether it's something between the SH and your inbox. I doubt it, though.


Path Finder

Yes, when sending it to a different, non-O365 email address, it goes faster.

But we figured out what might be the cause (still to be confirmed): the firewall resolves the FQDN to several IPs. So when we allow Office 365 on a certain port, for example for TLS, it won't work reliably because those IPs are bound to change every now and then, thanks to Microsoft. So I guess in order to fully work around this issue, we either need to send the mail to a different inbox and forward it from there, OR allow all IPs from which the mail might be sent.
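That theory is easy to check (sketch added by me, not from the thread): resolve the endpoint repeatedly over a few days and see whether the set of IPs shifts. The hostname below is the standard O365 SMTP endpoint, used here only as an illustrative example:

```python
import socket

def resolve_ips(hostname):
    """Return the set of IP addresses a hostname currently resolves to."""
    return {info[4][0] for info in socket.getaddrinfo(hostname, None)}

# Example (illustrative hostname; substitute your configured mailserver):
# print(resolve_ips("smtp.office365.com"))
```

If the set changes between runs, a firewall rule pinned to individual IPs will intermittently block the SMTP connection, which matches the "sometimes works, sometimes stalls" symptom. Allowing the destination by FQDN (or by Microsoft's published O365 IP ranges) avoids chasing individual addresses.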


I'll update this post once it's confirmed (or not). Fingers crossed!


Good luck! It definitely sounds easier to send it to a different email address and have a rule in Outlook forward that email. Either way, glad you found the issue.
