Alerting

Alert when Search runs too long

mmcve
Engager

Is there a way to have an alert sent to an email address after a Splunk search has been running for longer than a certain duration?

thomrs
Communicator

Something like this to get you started:

index=_* sourcetype=audittrail info!=canceled | stats list(info) min(_time) as min max(_time) as max list(total_run_time) list(search) by search_id user

You can adjust this a bit to fit your environment, then save it as a scheduled alert.
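As a sketch of how that adjustment might look: the fields below (total_run_time, info, search_id, user, search) all come from the audittrail events in the base search above, and the 300-second threshold is just an example value to tune:

```
index=_* sourcetype=audittrail info=completed total_run_time>300
| table _time user search_id total_run_time search
```

Save this as an alert, schedule it (for example, every 15 minutes over the last 15 minutes), and add an email trigger action. One caveat: total_run_time is only logged once a search finishes, so catching searches that are still running past the threshold would instead need comparing each search's start time against now().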
