<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: troubleshooting email alerts in splunk in Alerting</title>
    <link>https://community.splunk.com/t5/Alerting/troubleshooting-email-alerts-in-splunk/m-p/439025#M11544</link>
<description>&lt;P&gt;I can't see anything past 06-23, and I have no idea why I wouldn't be seeing logs beyond that date.&lt;/P&gt;

&lt;P&gt;So:&lt;/P&gt;

&lt;P&gt;Activity -&amp;gt; Jobs (nothing past 6/23, though I've been running this for weeks and it's been working)&lt;BR /&gt;
Activity -&amp;gt; Triggered Alerts (empty, though I think that's because the entries are older than 6/23)&lt;BR /&gt;
splunkd.log (zero sendemail entries in the file)&lt;/P&gt;</description>
    <pubDate>Mon, 24 Jun 2019 14:52:12 GMT</pubDate>
    <dc:creator>mburgess97</dc:creator>
    <dc:date>2019-06-24T14:52:12Z</dc:date>
    <item>
      <title>troubleshooting email alerts in splunk</title>
      <link>https://community.splunk.com/t5/Alerting/troubleshooting-email-alerts-in-splunk/m-p/439023#M11542</link>
<description>&lt;P&gt;I have several alerts that have been firing off an email. Everything has been working for several weeks. However, I noticed over the weekend that searches whose results should have triggered an alert did not send an email. I don't see any errors in the scheduler.log file.&lt;/P&gt;

&lt;P&gt;Any other recommendations for troubleshooting this issue?&lt;/P&gt;</description>
      <pubDate>Mon, 24 Jun 2019 13:26:19 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/troubleshooting-email-alerts-in-splunk/m-p/439023#M11542</guid>
      <dc:creator>mburgess97</dc:creator>
      <dc:date>2019-06-24T13:26:19Z</dc:date>
    </item>
    <item>
      <title>Re: troubleshooting email alerts in splunk</title>
      <link>https://community.splunk.com/t5/Alerting/troubleshooting-email-alerts-in-splunk/m-p/439024#M11543</link>
      <description>&lt;P&gt;Look in the jobs log (Activity -&amp;gt; Jobs) to see if the jobs ran, then check the alerts log (Activity -&amp;gt; Triggered Alerts) to see if they triggered alerts. Then check splunkd.log for "sendemail" to see if there were problems sending mail.&lt;/P&gt;</description>
      <pubDate>Mon, 24 Jun 2019 14:35:38 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/troubleshooting-email-alerts-in-splunk/m-p/439024#M11543</guid>
      <dc:creator>richgalloway</dc:creator>
      <dc:date>2019-06-24T14:35:38Z</dc:date>
    </item>
    <item>
      <title>Re: troubleshooting email alerts in splunk</title>
      <link>https://community.splunk.com/t5/Alerting/troubleshooting-email-alerts-in-splunk/m-p/439025#M11544</link>
      <description>&lt;P&gt;I can't see anything past 06-23, and I have no idea why I wouldn't be seeing logs beyond that date.&lt;/P&gt;

&lt;P&gt;So:&lt;/P&gt;

&lt;P&gt;Activity -&amp;gt; Jobs (nothing past 6/23, though I've been running this for weeks and it's been working)&lt;BR /&gt;
Activity -&amp;gt; Triggered Alerts (empty, though I think that's because the entries are older than 6/23)&lt;BR /&gt;
splunkd.log (zero sendemail entries in the file)&lt;/P&gt;</description>
      <pubDate>Mon, 24 Jun 2019 14:52:12 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/troubleshooting-email-alerts-in-splunk/m-p/439025#M11544</guid>
      <dc:creator>mburgess97</dc:creator>
      <dc:date>2019-06-24T14:52:12Z</dc:date>
    </item>
    <item>
      <title>Re: troubleshooting email alerts in splunk</title>
      <link>https://community.splunk.com/t5/Alerting/troubleshooting-email-alerts-in-splunk/m-p/439026#M11545</link>
      <description>&lt;P&gt;When you are looking at historic data, you may see something that is there now but may not have made it into the system by the time the job ran, because of indexing lag.&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=searched_index host=host_that_did_not_get_triggered other_relevant_filters
| eval lagSecs = _indextime - _time 
| timechart avg(lagSecs)
&lt;/CODE&gt;&lt;/PRE&gt;

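&lt;P&gt;As a further check, the scheduler's activity is recorded in the _internal index, so you can see whether the alert's search ran at all. A minimal sketch (the saved-search name is a placeholder; substitute your alert's actual name):&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=_internal sourcetype=scheduler savedsearch_name="your_alert_name"
| stats count by status
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;If this returns nothing for the period in question, the search never ran, which would explain the missing emails regardless of lag.&lt;/P&gt;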
&lt;P&gt;If you are unable to see any results for this job, then perhaps the search has not run in a while. You did mention you did not see any errors in the scheduler log. Is this search actually running when scheduled?&lt;/P&gt;</description>
      <pubDate>Mon, 01 Jul 2019 02:30:06 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/troubleshooting-email-alerts-in-splunk/m-p/439026#M11545</guid>
      <dc:creator>nvanderwalt_spl</dc:creator>
      <dc:date>2019-07-01T02:30:06Z</dc:date>
    </item>
  </channel>
</rss>

