Alerting

Why does "Alert results" show no result when there are results?

david_resnick
Explorer

I have an alert defined that should send an email if a search returns no results for a 5-minute time period.

Enabled: Yes
Alert Type: Real-time
Trigger Condition: Number of Results is < 1 in 5 minutes
App: search
Permissions: Shared in App

The search string is something like this:

host="production*" AND (uri_path="/web/foo" OR uri_path="/web/bar")

The alert is sending repeated emails, even though there have not been any 5-minute periods where the search returned 0 results.

Clicking the "view results" link shows a 5-minute period with supposedly no results, but clicking the magnifying glass to run the search again shows that there are results for that same period.

What is going on here?

Also, clicking the "alert" link in the alert emails shows the alert definition along with "There are no fired events for this alert."

gordo32
Communicator

I've seen this same question come up a couple of times, and my solution was different, so I thought I'd share it here in case others have the same problem I did.

The problem was that the query in my Alert was "search index=myindex sourcetype=waf httpstatus=400".

As soon as I removed the keyword "search" from the beginning of this query in the alert, it produced results consistent with manually issuing the search (index=myindex sourcetype=waf httpstatus=400). The rationale behind this (if I understood the support engineer correctly) is that the alert passes the query to the CLI (i.e. /bin/splunk search), so the CLI interprets the "search" at the start of my query as a searchable word, not a command.
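
To make the difference concrete, this is roughly what ends up being run in each case (reconstructed from the support engineer's description, so the exact CLI invocation may differ):

/bin/splunk search 'search index=myindex sourcetype=waf httpstatus=400'   <- leading "search" is treated as a literal term, so nothing matches
/bin/splunk search 'index=myindex sourcetype=waf httpstatus=400'          <- the intended query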


iwks
Explorer

I'm having the exact same problem you describe: a search with the trigger condition "Number of Results is < 5 in 10 minutes". The alert is always triggered, and clicking the "view results" link shows nothing until the magnifying glass is clicked to run the search again. Hopefully someone can offer some insight.


david_resnick
Explorer

We reached the tentative conclusion that our problem was an overloaded system, causing a delay in indexing. We also opened a case with Splunk support (for the other issues, not related to the false alerts), so far with no results.
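
If anyone else wants to check whether indexing lag could explain this on their own system, a search along these lines should show it, comparing _indextime (when Splunk actually indexed each event) against _time:

host="production*" AND (uri_path="/web/foo" OR uri_path="/web/bar") | eval index_lag = _indextime - _time | stats avg(index_lag) max(index_lag)

If max(index_lag) regularly climbs past the alert's 5-minute window, a real-time alert can genuinely see zero events in that window even though the same period shows results when it is searched again later.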
