Alerting

Can you help us set up a Splunk alert for a value below threshold?

wjared
Explorer

I have a search that generates a graph. The data it plots may or may not fall within a threshold value that we want to use as a custom trigger parameter. I want to set up an alert for when a new data point in that graph exceeds the threshold. I tried a custom trigger condition of the form:

|y-axis-name| <= |threshold|

To test whether the alert would fire at all, I set a threshold that we are already exceeding, but it doesn't trigger. The only explanation I can think of is that |y-axis-name| needs to be explicitly defined as a field in the search before it can be used in the custom trigger condition. Is that true?


woodcock
Esteemed Legend

IMHO, you should only use the alert trigger combination of Number of Results is greater than 0 and keep the threshold condition inside the SPL. That way it is immediately obvious to any investigator exactly how and why the alert triggered. In this design, yes, the field to examine must exist in the search results. If the threshold is itself a field, then you must use | where. If you are hard-coding a value, you can use either | search or | where.
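A sketch of that pattern (the field names here are placeholders, not taken from the original poster's search; assume the y-axis value has already been extracted into a field):

```
... base search ...
| where y_axis_value <= threshold_field
```

or, with a hard-coded threshold:

```
... base search ... | search y_axis_value <= 1000
```

In either case, set the alert trigger to Number of Results is greater than 0: rows only survive the pipeline when the threshold condition is met, so the breach logic lives visibly in the search itself.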


whrg
Motivator

Yes, the variables in the trigger condition must exist in the search.
Also, if you have multiple events/rows then perhaps the trigger condition will only check the first row.

Try adding the following part to your search:

... | eventstats last(y-axis-name) as lastvalue

And then have a trigger condition such as "search lastvalue <= 1000"

wjared
Explorer

Apologies if this is a repeat: after typing into the "Post Your Answer To This Question" field and submitting, my reply didn't appear anywhere, so I'm posting it again just in case.

Thanks so much for the response. Unfortunately, the trigger condition still isn't recognizing the variable I defined. I followed the syntax of your suggestion and tried a couple of modifications to no avail, so I think the problem now lies elsewhere. My guess is that because we parse log data for a specific value to generate our y-axis, and our Splunk search is fairly cryptic (mostly regex), we aren't extracting that value into a clearly named field that the trigger condition can reference. I will see if there is a better way to write the search so the field assignment and trigger condition work together. Any further insight you may have would be greatly valued. Thanks so much 🙂


whrg
Motivator

Perhaps you could provide more details as to what your graph looks like or you could post your search.

For testing purposes, I saved the following search as an alert (time range: Last 60 minutes):

index=_internal sourcetype=splunkd | stats count

I set the trigger condition "search count > 1000". The alert did trigger.

Instead of using a custom trigger condition, you could append something like "... | search value>1000" to your search. Then set the trigger condition to: Number of Results is greater than 0.
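That alternative might look like the following (index, sourcetype, and field names are illustrative placeholders, not from the original search). The filter is appended to the search itself, so results only exist when the value is out of range:

```
index=my_index sourcetype=my_sourcetype
| stats latest(value) as value
| search value > 1000
```

Then the alert trigger is simply Number of Results is greater than 0, with no custom trigger-condition syntax to get wrong.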


wjared
Explorer

Thanks again for your help, I finally got things working. Based on your response, I added "| stats count(eval(y-axis-param <= 2)) as failcount" to the end of my search, then set the trigger condition to "search failcount > 0". Before, I didn't have the search keyword in there.

The other change I had to make: my search originally contained both the stats condition above AND the charting command, "| timechart (y-axis-param)". Whether or not this should be the case, I found that the two wouldn't work at the same time; only whichever came first in the search was executed. So I removed the charting command from the search and kept just the stats condition, and then the alerts started working.
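The working setup described above might be sketched like this (y-axis-param stands in for the actual extracted field; note that a field name containing hyphens must be wrapped in single quotes inside eval, or SPL will treat the hyphens as subtraction):

```
... base search with field extraction ...
| stats count(eval('y-axis-param' <= 2)) as failcount
```

Trigger condition: search failcount > 0. The timechart command stays out of the alert search, since, as found above, combining it with stats in one search meant only whichever came first took effect; the chart can live in a separate saved search or dashboard panel.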

Thanks so much!

whrg
Motivator

Glad to hear it is working now!

I actually considered that you might have missed search in the trigger condition. (I even missed it in my reply and I had to edit it.) However, when trying to create such an alert (trigger condition: count > 1000) then I get this error message: "Cannot parse alert condition. Search Factory: Unknown search command 'count'." (I'm running Splunk 7.2.1 which is the latest version.)


wjared
Explorer

I believe I got the same error message, yes. As mentioned, the other issue with my search was that it included a charting clause when I saved it as an alert. The search would only execute either the alert clause or the charting clause, whichever came first, but not both. Thanks again for the help 🙂
