Alerting

Alert when "rises by" doesn't work

I am trying to raise an alert when the number of results rises by 1. Each result represents a device going offline, and I need to send an email every time a device goes offline. I run a scheduled search every 5 minutes because I want to capture the current state in case one device comes back online while another one goes offline.

When I configured the alert I had 0 results. Ten minutes later one device went offline, and ten minutes after that another one went offline, but I didn't receive any email. I'm not sure whether I've misinterpreted the "rises by" setting or it simply isn't working. What are your thoughts?


Re: Alert when "rises by" doesn't work

SplunkTrust

Please share your search.


Re: Alert when "rises by" doesn't work

| mcatalog values(device) as device where index=*_metric 
| mvexpand device
| join type=left device [
    | mstats max(_value) as qgets where metric_name=*.QueueGets, queue_name=_monitor, service=*nq index=*_metric earliest=-3m by device, customer
]
| eval online = if(qgets>0, "Online", "Offline")
| search online=Offline

Re: Alert when "rises by" doesn't work

Esteemed Legend

I never use the alert trigger features; I always implement my own trigger logic directly in the SPL of the search itself. That way I can see the history of what is going on and debug things. The way to do it is to save the previous results in a lookup file with | outputlookup YourSearchResultsHistoryLookupFile.csv. Although the use case is different, the mechanics of what you need to do to go this route can be found in the sentinel example here:
https://conf.splunk.com/session/2015/conf2015-LookupTalk.pdf
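To make that pattern concrete, here is a rough, untested sketch (the lookup file name `offline_history.csv` and the marker fields are placeholders, not from the talk). The search compares the devices that are offline now against the set saved on the previous run, writes the current set back to the lookup, and only returns rows for newly offline devices, so the alert can simply trigger on "number of results > 0":

```
<your search that yields one row per currently-offline device>
| eval current="yes"
| inputlookup append=t offline_history.csv
| stats values(current) as current, values(previous) as previous by device
| where current="yes"
| eval was_offline=previous
| eval previous="yes"
| outputlookup offline_history.csv
| where isnull(was_offline)
```

The history rows carry previous="yes" from the prior run; after the stats, a device with current="yes" but no previous value is newly offline. Note that outputlookup runs mid-pipeline on purpose, so the full current offline set is saved before the final filter narrows the results down to the new devices.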


Re: Alert when "rises by" doesn't work

Splunk Employee

From the scheduler log:
If the alert's permissions were set to App or Global sharing, the scheduler fires the alert with an entry like this:

[INFO  SavedSplunker - savedsearchid="nobody;search;your_alert", searchtype="scheduled", user="admin", app="search", savedsearchname="your_alert"]

Note that the owner in savedsearchid="nobody;search;your_alert" does not match user="admin": the specific user is "admin", whereas the saved search is owned by "nobody".
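You can inspect these scheduler events from Splunk itself. A search along these lines (the saved search name is a placeholder, and the exact field set in your scheduler log may differ) shows whether and how the scheduler fired the alert:

```
index=_internal sourcetype=scheduler savedsearch_name="your_alert"
| table _time, status, user, app, result_count
```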

In that case, you need to create the alert as Private, or manually change the owner to "nobody" in local.meta if it was created with App or Global sharing,
e.g. under $SPLUNK_HOME/etc/apps/search/metadata/local.meta

From:

[savedsearches/your_alert]
owner = admin

To:
[savedsearches/your_alert]
owner = nobody

Then verify that the owner has changed by going to "Settings" -> "Searches, Reports, and Alerts" and checking the "Owner" field for the alert.
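You can also check the ownership and sharing from the search bar via the REST endpoint (assuming your role may query it; the saved search name is a placeholder):

```
| rest /servicesNS/-/-/saved/searches splunk_server=local
| search title="your_alert"
| table title, eai:acl.owner, eai:acl.sharing, eai:acl.app
```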

Hope it helps

