<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Email Alert Issue sending via AWS SES in Alerting</title>
    <link>https://community.splunk.com/t5/Alerting/Email-Alert-Issue-sending-via-AWS-SES/m-p/624728#M14582</link>
    <description>&lt;P&gt;Hi, did you manage to find a solution to this issue? The same thing is happening to me after upgrading from Splunk 8 to 9.&lt;/P&gt;&lt;P&gt;Thanks.&lt;/P&gt;</description>
    <pubDate>Mon, 19 Dec 2022 14:36:10 GMT</pubDate>
    <dc:creator>lbadmin</dc:creator>
    <dc:date>2022-12-19T14:36:10Z</dc:date>
    <item>
      <title>Email Alert Issue sending via AWS SES</title>
      <link>https://community.splunk.com/t5/Alerting/Email-Alert-Issue-sending-via-AWS-SES/m-p/448040#M12590</link>
      <description>&lt;P&gt;Hi folks,&lt;/P&gt;

&lt;P&gt;We're running Splunk 7.1.1 and having issues sending email alerts via AWS SES: scheduled alert emails fail, but when we send mail via a search string it works just fine. I've also modified the sendemail.py file to enable TLS. Has anyone successfully managed to get this working? I have not seen any Splunk Answers post that solves the automatic alert issue, only sending via a search.&lt;/P&gt;

&lt;P&gt;==&amp;gt; var/log/splunk/python.log &amp;lt;==&lt;BR /&gt;
2019-08-15 15:13:02,306 +0000 ERROR sendemail:140 - Sending email. subject="Splunk Alert: Amazon Client Exception", results_link="&lt;A href="http://aws-domain:8000/app/search/@go?sid=scheduler__gearoidr__search__RMD56da3f171ecf1725d_at_1565881980_2" target="_blank"&gt;http://aws-domain:8000/app/search/@go?sid=scheduler__gearoidr__search__RMD56da3f171ecf1725d_at_1565881980_2&lt;/A&gt;", recipients="[u'&lt;A href="mailto:verifiied.user@blah.com" target="_blank"&gt;verifiied.user@blah.com&lt;/A&gt;']", server="email-smtp.us-east-1.amazonaws.com:587"&lt;BR /&gt;
2019-08-15 15:13:02,306 +0000 ERROR sendemail:463 - (554, "Transaction failed: User name is missing: 'splunk'.") while sending mail to: &lt;A href="mailto:verifiied.user@blah.com" target="_blank"&gt;verifiied.user@blah.com&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;==&amp;gt; var/log/splunk/splunkd.log &amp;lt;==&lt;BR /&gt;
08-15-2019 15:13:02.306 +0000 ERROR ScriptRunner - stderr from '/opt/splunk/bin/python /opt/splunk/etc/apps/search/bin/sendemail.py "results_link=&lt;A href="http://aws-domain:8000/app/search/@go?sid=scheduler__gearoidr__search__RMD56da3f171ecf1725d_at_1565881980_2" target="_blank"&gt;http://aws-domain:8000/app/search/@go?sid=scheduler__gearoidr__search__RMD56da3f171ecf1725d_at_1565881980_2&lt;/A&gt;" "ssname=Amazon Client Exception" "graceful=True" "trigger_time=1565881981" results_file="/opt/splunk/var/run/splunk/dispatch/scheduler_&lt;EM&gt;gearoidr&lt;/EM&gt;&lt;EM&gt;search&lt;/EM&gt;_RMD56da3f171ecf1725d_at_1565881980_2/results.csv.gz"':  ERROR:root:(554, "Transaction failed: User name is missing: 'splunk'.") while sending mail to: &lt;A href="mailto:verifiied.user@blah.com" target="_blank"&gt;verifiied.user@blah.com&lt;/A&gt;&lt;BR /&gt;
08-15-2019 15:13:13.160 +0000 INFO  TcpOutputProc - Connected to idx=10.0.26.132:9997, pset=0, reuse=0.&lt;/P&gt;

&lt;P&gt;For comparison, the logs from a successful email sent via a search:&lt;BR /&gt;
==&amp;gt; var/log/splunk/python.log &amp;lt;==&lt;BR /&gt;
2019-08-15 15:28:55,535 +0000 INFO  sendemail:1299 - Generated PDF for email&lt;BR /&gt;
2019-08-15 15:28:55,880 +0000 INFO  sendemail:137 - Sending email. subject="Here is an email notification", results_link="None", recipients="[u'&lt;A href="mailto:verifiied.user@blah.com" target="_blank"&gt;verifiied.user@blah.com&lt;/A&gt;']", server="email-smtp.us-east-1.amazonaws.com:587"&lt;/P&gt;

</description>
      <pubDate>Wed, 30 Sep 2020 01:44:55 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Email-Alert-Issue-sending-via-AWS-SES/m-p/448040#M12590</guid>
      <dc:creator>gearoidrogers</dc:creator>
      <dc:date>2020-09-30T01:44:55Z</dc:date>
    </item>
    <item>
      <title>Re: Email Alert Issue sending via AWS SES</title>
      <link>https://community.splunk.com/t5/Alerting/Email-Alert-Issue-sending-via-AWS-SES/m-p/618609#M14455</link>
      <description>&lt;P&gt;We are also facing the same issue with the latest Splunk version, 9.0.1.&lt;BR /&gt;I have also tried port 465 with SSL enabled.&lt;/P&gt;</description>
      <pubDate>Thu, 27 Oct 2022 16:18:50 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Email-Alert-Issue-sending-via-AWS-SES/m-p/618609#M14455</guid>
      <dc:creator>pavan_fwd</dc:creator>
      <dc:date>2022-10-27T16:18:50Z</dc:date>
    </item>
    <item>
      <title>Re: Email Alert Issue sending via AWS SES</title>
      <link>https://community.splunk.com/t5/Alerting/Email-Alert-Issue-sending-via-AWS-SES/m-p/624728#M14582</link>
      <description>&lt;P&gt;Hi, did you manage to find a solution to this issue? The same thing is happening to me after upgrading from Splunk 8 to 9.&lt;/P&gt;&lt;P&gt;Thanks.&lt;/P&gt;</description>
      <pubDate>Mon, 19 Dec 2022 14:36:10 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Email-Alert-Issue-sending-via-AWS-SES/m-p/624728#M14582</guid>
      <dc:creator>lbadmin</dc:creator>
      <dc:date>2022-12-19T14:36:10Z</dc:date>
    </item>
  </channel>
</rss>

