Alert based on number of events after charting the results


I am a newbie to Splunk and have been trying to create an alert using the following search:

index=prodenvs source=prodlogs "transaction timeout"

I had set it up on a cron schedule to run every 5 minutes, search the past 5 minutes, and alert if the count exceeds 10. The alert was working fine, but I wanted to summarize the results by host:

index=prodenvs source=prodlogs "transaction timeout" | chart count by host

The alert stopped working after I changed the search, and I noticed that the alert condition is now treating the number of hosts (i.e., result rows) as the "count". Can I still trigger the alert based on the number of events and yet be able to chart my results? Any help is greatly appreciated.



You can set this as a custom alert condition:

stats sum(count) as sum | where sum > 10
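
An alternative, if you'd rather keep everything in the search itself, is to compute the overall event count inline with eventstats so each charted row also carries the total (a sketch reusing your original search; total_events is just an illustrative field name):

index=prodenvs source=prodlogs "transaction timeout"
| chart count by host
| eventstats sum(count) as total_events

You could then use a custom trigger condition of search total_events > 10, which fires whenever the total event count across all hosts exceeds 10, while the results still show the per-host breakdown.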


Now you can set the alert trigger using sum(count).

We will be able to provide a clearer answer if you can give us some more details.



Where did you set up the alert? Was it in Splunk Web, a custom script, or something else?
Can you provide some more details?
