Alerting

Is it possible to have multiple alert conditions for one saved search?

shangshin
Builder

Hi,
As far as I know, one saved search can have only one alert condition.
I have "heartbeat" string in my log and I set up a saved search in scheduler --

 sourcetype="app.log" "Heartbeat check completed"

to trigger an alert email with one alert condition: "number of events is less than 1".
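
For reference, this is roughly what that saved search and alert condition look like in savedsearches.conf (just a sketch from my side; the stanza name, schedule, and email address below are made up):

 [heartbeat_check]
 search = sourcetype="app.log" "Heartbeat check completed"
 enableSched = 1
 cron_schedule = */5 * * * *
 counttype = number of events
 relation = less than
 quantity = 1
 action.email = 1
 action.email.to = ops@example.com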

However, false alarms will occur when indexers are in the middle of restarting.

Is it possible to add an extra alert condition, such as "number of scanning events is more than 10"?

0 Karma

martin_mueller
SplunkTrust

Assuming a scanning event looks like "scanning event", you could do this:

sourcetype="app.log" "Heartbeat check completed" OR "scanning event" | eval category = if(searchmatch("scanning event", "scanning", "heartbeat") | chart count over host by category | where scanning > 10 AND heartbeat < 5

Alert when there is at least one event, and it'll tell you which hosts had more than ten scanning events but fewer than five completed heartbeat checks.
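
With that search, each remaining result row would look something like this (illustrative values only):

 host      heartbeat   scanning
 app-01            2         14

so the standard "number of events is greater than 0" alert condition is enough on top of it.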

martin_mueller
SplunkTrust

Something along the lines of host=* scanning=* NOT "heartbeat check completed"? Simple boolean expressions, really.
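
Fleshed out, that might look roughly like this (just a sketch; the stats part is only one way to count per host, and whether a scanning field exists is an assumption):

 sourcetype="app.log" host=* scanning=* NOT "Heartbeat check completed" | stats count by host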

0 Karma

shangshin
Builder

Thanks a lot for the answer.

In most cases I will get one row with three columns -- host, heartbeat, and scanning. However, if "Heartbeat check completed" is missing from the log because of an application error, I will get only one row with two columns -- host and scanning. How can the alert be triggered in that case?
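
Would adding a fillnull before the where clause handle that case? For example (just a guess on my side):

 ... | chart count over host by category | fillnull value=0 heartbeat scanning | where scanning > 10 AND heartbeat < 5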

0 Karma