Splunk Observability Cloud

Detector for alert storm / detector on detector

MichaelRTR
Loves-to-Learn

Hi there,

I have a use case where I want to trigger an alert/detector when there is a spike in triggered detectors in my org. The idea is to catch an outage and alert the appropriate team.

Is there a suitable way to achieve this? I have experimented, but I do not see any suitable metrics to use. This seems like a simple use case, but there does not appear to be a simple solution.


As an example, I would like to trigger a detector when more than 5 P1 detectors have triggered within a 5-minute period.

Thanks in advance.


bishida
Splunk Employee

I would recommend sending the alerts to the Splunk platform (Enterprise or Cloud) and performing event correlation there. This is a core use case for IT Service Intelligence. But if you wanted to just address this one example, you could do it pretty easily with a one-off search/alert in the platform. 
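
For the one-off approach, here is a minimal sketch of what such a scheduled search could look like, assuming Observability Cloud detector notifications are forwarded into the platform (for example via a webhook integration into an HTTP Event Collector index). The index name obs_alerts and the severity and detector_name fields are assumptions about how the forwarded events land, not a documented schema:

``` count distinct P1 detectors that fired in the last 5 minutes ```
index=obs_alerts severity="P1" earliest=-5m latest=now
| stats dc(detector_name) AS triggered_detectors
``` keep results only when more than 5 distinct detectors triggered ```
| where triggered_detectors > 5

Scheduled every 5 minutes with an alert condition of "number of results is greater than 0", this would fire whenever more than five distinct P1 detectors triggered within the window.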
