<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic PagerDuty autoresolve of alerts in All Apps and Add-ons</title>
    <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/PagerDuty-autoresolve-of-alerts/m-p/407759#M49697</link>
    <description>&lt;P&gt;Hi,&lt;/P&gt;

&lt;P&gt;Is it possible for Splunk to resolve PD alerts once the triggering condition has gone back below the threshold?&lt;/P&gt;

&lt;P&gt;Regards&lt;BR /&gt;
Silvano&lt;/P&gt;</description>
    <pubDate>Fri, 30 Nov 2018 11:48:14 GMT</pubDate>
    <dc:creator>silvanop</dc:creator>
    <dc:date>2018-11-30T11:48:14Z</dc:date>
    <item>
      <title>PagerDuty autoresolve of alerts</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/PagerDuty-autoresolve-of-alerts/m-p/407759#M49697</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;

&lt;P&gt;Is it possible for Splunk to resolve PD alerts once the triggering condition has gone back below the threshold?&lt;/P&gt;

&lt;P&gt;Regards&lt;BR /&gt;
Silvano&lt;/P&gt;</description>
      <pubDate>Fri, 30 Nov 2018 11:48:14 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/PagerDuty-autoresolve-of-alerts/m-p/407759#M49697</guid>
      <dc:creator>silvanop</dc:creator>
      <dc:date>2018-11-30T11:48:14Z</dc:date>
    </item>
    <item>
      <title>Re: PagerDuty autoresolve of alerts</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/PagerDuty-autoresolve-of-alerts/m-p/504275#M62120</link>
      <description>&lt;P&gt;Hey Silvano,&lt;/P&gt;

&lt;P&gt;Yes, it is. You can actually trigger and resolve PagerDuty incidents using the same alert.&lt;/P&gt;

&lt;P&gt;Take a look at the following example code. (In production you would probably put this into a macro and pass the event_action as an argument; a sketch follows below.)&lt;/P&gt;

&lt;P&gt;index=_internal ERROR&lt;BR /&gt;| stats count as event_count&lt;BR /&gt;| eval dedup_key="ddddd"&lt;BR /&gt;| eval severity="warning"&lt;BR /&gt;| eval event_action=case(event_count&amp;gt;0,"trigger",1=1,"resolve")&lt;BR /&gt;| eval summary="A summary of this event"&lt;BR /&gt;| eval source="a.server.example.com"&lt;BR /&gt;| eval routing_key="SOME_ROUTING_KEY"&lt;BR /&gt;| table dedup_key, severity, event_action, summary, source, routing_key&lt;/P&gt;

&lt;P&gt;The fields above are the minimum for a PagerDuty alert (see the Events API v2 payload reference below this post). When there are one or more results, the action will be to trigger an incident; when there are none, it will send a resolve. The dedup key will default to the name of the search, so you don't need to specify it.&lt;/P&gt;

&lt;P&gt;Note: the stats count guarantees the search always returns a row, even when nothing matches, since you need a result in order to send either a trigger or a resolve. It also means this approach only works for a single alert.&lt;/P&gt;

&lt;P&gt;For this to work you need to use event rules in PagerDuty. Create a new event rule set with a minimum of two rules:&lt;BR /&gt;- The first handles resolves, i.e. if result.event_action=resolve, then resolve the incident.&lt;BR /&gt;- The second handles triggers, i.e. if result.event_action=trigger, then raise an incident.&lt;/P&gt;

&lt;P&gt;There are other things you may want to do, such as repeating the trigger rule for each severity.&lt;/P&gt;

&lt;P&gt;That should get you auto-resolving PagerDuty incidents. This was the best way I could find; if you have found anything better since, let me know.&lt;/P&gt;</description>
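      <!-- A minimal sketch of the macro suggested above, for illustration only.
           The macro name pd_event and all field values are hypothetical. Splunk
           macro arguments are substituted textually before the search runs, so
           this sketch passes the trigger condition rather than a runtime field
           value, and derives event_action from it inside the macro (macros.conf):

           [pd_event(1)]
           args = trigger_condition
           definition = eval event_action=if($trigger_condition$, "trigger", "resolve"), dedup_key="ddddd", severity="warning", summary="A summary of this event", source="a.server.example.com", routing_key="SOME_ROUTING_KEY" | table dedup_key, severity, event_action, summary, source, routing_key

           Usage, matching the example search above:

           index=_internal ERROR
           | stats count as event_count
           | `pd_event(event_count>0)`
      -->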
      <pubDate>Sat, 13 Jun 2020 14:07:00 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/PagerDuty-autoresolve-of-alerts/m-p/504275#M62120</guid>
      <dc:creator>jethrop</dc:creator>
      <dc:date>2020-06-13T14:07:00Z</dc:date>
    </item>
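    <!-- For reference, the field names built in the example SPL correspond to the
         body of a PagerDuty Events API v2 request. An equivalent raw payload,
         using the placeholder values from the answer above, would look like:

         {
           "routing_key": "SOME_ROUTING_KEY",
           "event_action": "trigger",
           "dedup_key": "ddddd",
           "payload": {
             "summary": "A summary of this event",
             "source": "a.server.example.com",
             "severity": "warning"
           }
         }

         Exactly how the PagerDuty add-on maps the table fields into this payload
         is an assumption here; check the add-on's documentation for the field
         names it expects.
    -->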
  </channel>
</rss>