Alerting

Alert when Search runs too long

mmcve
Engager

I was wondering if there is a way to have an alert sent to an email address after a Splunk search has been running for longer than a certain amount of time?

thomrs
Communicator

Something like this to get you started:

index=_* sourcetype=audittrail info!=canceled | stats list(info) min(_time) as min max(_time) as max list(total_run_time) list(search) by search_id user

You can adjust it a bit to build your alert; one way to extend it is sketched below.
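For example, a minimal sketch (not tested in your environment) that builds on that audit search: keep only searches whose latest audit event is not a terminal state (completed, canceled, failed), work out how long each has been running, and keep rows past a threshold. The fields info, search_id, user, and total_run_time come from the audit trail as in the search above; the 600-second threshold and the terminal-state filter are assumptions to tune for your case.

index=_* sourcetype=audittrail search_id=*
| stats min(_time) as started latest(info) as last_info by search_id user
| where last_info!="completed" AND last_info!="canceled" AND last_info!="failed"
| eval running_secs=now()-started
| where running_secs>600
| table user search_id last_info running_secs

If that returns the long-running searches you care about, save it as a scheduled alert (for example, run every 5 minutes over a recent window), set the trigger condition to "Number of Results is greater than 0", and add the "Send email" trigger action.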
