Alerting

Why does "Alert results" show no results when there are results?

david_resnick
Explorer

I have an alert defined that should send an email if a search returns no results for a 5-minute time period.

Enabled: Yes
Alert Type: Real-time
Trigger Condition: Number of Results is < 1 in 5 minutes
App: search
Permissions: Shared in App

The search string is something like this:

host="production*" AND (uri_path="/web/foo" OR uri_path="/web/bar")

The alert is sending repeated emails, even though there have not been any 5-minute periods where the search returned 0 results.

Clicking the "view results" link shows the 5-minute period with supposedly no results, but clicking the magnifying glass to re-run the search shows that there are results for that same period.

What is going on here?

Also, clicking the "alert" link in the alert emails shows the alert definition along with "There are no fired events for this alert."

gordo32
Communicator

I've seen this same question come up a couple of times, and my solution was different, so I thought I'd share it in case others have the same problem I did.

The problem was that the query in my alert was "search index=myindex sourcetype=waf httpstatus=400".

As soon as I removed the keyword "search" from the beginning of the query in the alert, it produced results consistent with manually issuing the search (index=myindex sourcetype=waf httpstatus=400). The rationale (if I understood the support engineer correctly) is that the alert passes the query to the CLI (i.e., /bin/splunk search), so the CLI interprets the "search" keyword in my query as a searchable term rather than a command.
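
To make the difference concrete, here are the two forms side by side, plus the CLI invocation that (per the support engineer's explanation) is roughly how the alert runs the search. The $SPLUNK_HOME path and quoting are assumptions; adjust for your install and shell:

Query as I originally had it in the alert (the leading "search" ends up treated as a literal term):
search index=myindex sourcetype=waf httpstatus=400

Query after the fix (the bare search string, exactly as typed into the search bar):
index=myindex sourcetype=waf httpstatus=400

Reproducing it from the CLI:
$SPLUNK_HOME/bin/splunk search 'index=myindex sourcetype=waf httpstatus=400'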


iwks
Explorer

I'm having the exact problem you describe. I'm using a search with the trigger condition of < 5 results in 10 minutes. The alert is always triggered, and clicking the "view results" link shows nothing until I click the magnifying glass to run the search again. Hopefully someone can offer some insight.


david_resnick
Explorer

We reached the tentative conclusion that our problem was an overloaded system causing a delay in indexing: events were arriving but had not yet been indexed when the real-time alert window was evaluated, so the alert saw zero results even though the same events show up in a later search. We also opened a case with Splunk support (for other issues, not related to the false alerts), so far with no results.
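
For anyone else who suspects indexing delay, a quick way to check (a generic sketch, not something from our support case) is to compare each event's index time with its event time over the same search the alert uses:

host="production*" AND (uri_path="/web/foo" OR uri_path="/web/bar")
| eval lag_seconds = _indextime - _time
| stats avg(lag_seconds) AS avg_lag max(lag_seconds) AS max_lag

If max_lag regularly approaches or exceeds the 5-minute alert window, a real-time alert can legitimately see zero results for events that have arrived but are not yet searchable.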
