Splunk Search

index and alert on any host by total count

New Member

I want to index data and alert if any host sees more than 1k messages over the last 30 minutes.

How do I do this?



If you're interested in evaluating longer time periods (or many hosts at once), you may want to check out the metrics logs that Splunk generates internally. For example, to find hosts exceeding 1k events:

index=_internal source=*metrics* group=per_host_thruput | stats sum(ev) as count by series | search count>1000

You can also specify time constraints in the scheduled search instead of the search syntax itself.
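As a rough sketch, a scheduled saved search with its time window set in the savedsearches.conf stanza rather than in the SPL itself might look like this (the stanza name and cron schedule are illustrative, not from the original post):

[Host Throughput Alert]
search = index=_internal source=*metrics* group=per_host_thruput | stats sum(ev) as count by series | search count>1000
dispatch.earliest_time = -30m
dispatch.latest_time = now
cron_schedule = */30 * * * *
enableSched = 1

Putting the window in dispatch.earliest_time/dispatch.latest_time keeps the search string reusable for ad-hoc runs with different time ranges.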


Splunk Employee

To get your data indexed in the first place, you can follow the numerous getting-started guides. To alert when any host sees more than 1k messages, I'd do the following:

YourSearch earliest=-30m | stats count by host | where count > 1000

You can then schedule that search and configure the alert to fire when the result count is greater than 0.
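In savedsearches.conf terms, that schedule-plus-alert setup might look roughly like the following (the stanza name, cron schedule, and email address are illustrative assumptions; counttype/relation/quantity express the "alert if results > 0" condition):

[Host Message Volume Alert]
search = YourSearch earliest=-30m | stats count by host | where count > 1000
cron_schedule = */30 * * * *
enableSched = 1
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = you@example.com

The same settings can be configured through the UI by saving the search as a scheduled alert.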
