<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Alert when &amp;quot;raises by&amp;quot; doesn't work in Alerting</title>
    <link>https://community.splunk.com/t5/Alerting/Alert-when-quot-raises-by-quot-doesn-t-work/m-p/394963#M6954</link>
    <description>&lt;PRE&gt;&lt;CODE&gt;| mcatalog values(device) as device where index=*_metric 
| mvexpand device
| join type=left device [
    | mstats max(_value) as qgets where metric_name=*.QueueGets queue_name=_monitor service=*nq index=*_metric earliest=-3m by device, customer
]
| eval online = if(qgets&amp;gt;0, "Online", "Offline")
| search online=Offline
&lt;/CODE&gt;&lt;/PRE&gt;</description>
    <pubDate>Sun, 24 Feb 2019 22:44:12 GMT</pubDate>
    <dc:creator>rocketboots_ser</dc:creator>
    <dc:date>2019-02-24T22:44:12Z</dc:date>
    <item>
      <title>Alert when "raises by" doesn't work</title>
      <link>https://community.splunk.com/t5/Alerting/Alert-when-quot-raises-by-quot-doesn-t-work/m-p/394961#M6952</link>
      <description>&lt;P&gt;I am trying to raise an alert when the number of results raises by 1. Each result represents a device going offline and I need to send an email every time a device goes offline. I have a scheduled search every 5 minutes because I want to remember the current state in case one device goes online and another one goes offline.&lt;/P&gt;

&lt;P&gt;When I configured the alert there were 0 results. Ten minutes later one device went offline, and 10 minutes after that another one went offline, but I didn't receive any email. I'm not sure whether I've misinterpreted the "rises by" setting or it simply isn't working. What are your thoughts?&lt;/P&gt;</description>
      <pubDate>Fri, 22 Feb 2019 07:23:59 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Alert-when-quot-raises-by-quot-doesn-t-work/m-p/394961#M6952</guid>
      <dc:creator>rocketboots_ser</dc:creator>
      <dc:date>2019-02-22T07:23:59Z</dc:date>
    </item>
    <item>
      <title>Re: Alert when "raises by" doesn't work</title>
      <link>https://community.splunk.com/t5/Alerting/Alert-when-quot-raises-by-quot-doesn-t-work/m-p/394962#M6953</link>
      <description>&lt;P&gt;pls share your search&lt;/P&gt;</description>
      <pubDate>Fri, 22 Feb 2019 09:56:50 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Alert-when-quot-raises-by-quot-doesn-t-work/m-p/394962#M6953</guid>
      <dc:creator>lakshman239</dc:creator>
      <dc:date>2019-02-22T09:56:50Z</dc:date>
    </item>
    <item>
      <title>Re: Alert when "raises by" doesn't work</title>
      <link>https://community.splunk.com/t5/Alerting/Alert-when-quot-raises-by-quot-doesn-t-work/m-p/394963#M6954</link>
      <description>&lt;PRE&gt;&lt;CODE&gt;| mcatalog values(device) as device where index=*_metric 
| mvexpand device
| join type=left device [
    | mstats max(_value) as qgets where metric_name=*.QueueGets queue_name=_monitor service=*nq index=*_metric earliest=-3m by device, customer
]
| eval online = if(qgets&amp;gt;0, "Online", "Offline")
| search online=Offline
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Sun, 24 Feb 2019 22:44:12 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Alert-when-quot-raises-by-quot-doesn-t-work/m-p/394963#M6954</guid>
      <dc:creator>rocketboots_ser</dc:creator>
      <dc:date>2019-02-24T22:44:12Z</dc:date>
    </item>
    <item>
      <title>Re: Alert when "raises by" doesn't work</title>
      <link>https://community.splunk.com/t5/Alerting/Alert-when-quot-raises-by-quot-doesn-t-work/m-p/394964#M6955</link>
      <description>&lt;P&gt;I never use the alert trigger features; I always implement my own directly in the SPL of the search itself.  That way I can see the history of what is going on and debug things.  The way that you do it is to save the previous results in lookup file by using &lt;CODE&gt;| outputlookup YourSearchResultsHistoryLookupFile.csv&lt;/CODE&gt;  Although the use case is different, the mechanics of what you need to do to go this route can be found in the &lt;CODE&gt;sentinel&lt;/CODE&gt; example here:&lt;BR /&gt;
&lt;A href="https://conf.splunk.com/session/2015/conf2015-LookupTalk.pdf"&gt;https://conf.splunk.com/session/2015/conf2015-LookupTalk.pdf&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 06 Mar 2019 02:35:54 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Alert-when-quot-raises-by-quot-doesn-t-work/m-p/394964#M6955</guid>
      <dc:creator>woodcock</dc:creator>
      <dc:date>2019-03-06T02:35:54Z</dc:date>
    </item>
    <item>
      <title>Re: Alert when "raises by" doesn't work</title>
      <link>https://community.splunk.com/t5/Alerting/Alert-when-quot-raises-by-quot-doesn-t-work/m-p/394965#M6956</link>
      <description>&lt;P&gt;From the scheduler log:&lt;BR /&gt;
If the permission of the Alert has been created as shared App or Globally, the scheduler fires the Alert as below:&lt;/P&gt;

&lt;P&gt;&lt;CODE&gt;[INFO SavedSplunker - savedsearch_id="nobody;search;your_alert_name", search_type="scheduled", user="admin", app="search", savedsearch_name="your_alert_name"]&lt;/CODE&gt;&lt;/P&gt;

&lt;P&gt;Note that savedsearch_id="nobody;search;your_alert_name" does not match user="admin", app="search", savedsearch_name="your_alert_name": the owner embedded in savedsearch_id is "nobody", while the search runs as the specific user "admin".&lt;/P&gt;

&lt;P&gt;In that case, either create the alert as private, or manually change the owner to "nobody" in local.meta if the alert is shared at the app or global level,&lt;BR /&gt;
e.g. under $SPLUNK_HOME/etc/apps/search/metadata/local.meta:&lt;/P&gt;

&lt;P&gt;From&lt;BR /&gt;&lt;BR /&gt;
[savedsearches/your_alert] &lt;BR /&gt;
owner = admin &lt;/P&gt;

&lt;P&gt;To &lt;BR /&gt;
[savedsearches/your_alert] &lt;BR /&gt;
owner = nobody &lt;/P&gt;

&lt;P&gt;Then verify that the owner has changed by going to "Settings" -&amp;gt; "Searches, Reports, and Alerts" and checking the "Owner" field for the alert.&lt;/P&gt;

&lt;P&gt;Hope it helps&lt;/P&gt;</description>
      <pubDate>Wed, 30 Sep 2020 01:00:34 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Alert-when-quot-raises-by-quot-doesn-t-work/m-p/394965#M6956</guid>
      <dc:creator>dchoi_splunk</dc:creator>
      <dc:date>2020-09-30T01:00:34Z</dc:date>
    </item>
  </channel>
</rss>

