Monitoring Splunk

Alert setup best practice to avoid max concurrent searches

pdantuuri0411
Explorer

Hi, we are trying to set up around 60 alerts. Ideally, each alert runs every 3 minutes and checks the data for the last 3 minutes. I am aware of the issue with concurrent searches and alerts getting skipped when there are more than 5 concurrent searches.

What is the best way to create these alerts?

Is there a way to set up the alerts to run at staggered offsets within a minute, like the example below?

Example - 

Alert 1 - 12:00:00

Alert 2 - 12:00:05

Alert 3 - 12:00:10

Alert 4 - 12:00:15


anilchaithu
Builder

@pdantuuri0411 

It's possible to distribute the search jobs by using one of the techniques below:

  • schedule_priority (use this for better results)
  • schedule window (schedule_window)

You can configure these through Searches, Reports & Alerts -> Edit -> Advanced Edit

OR 

add the attributes below to the relevant stanza in savedsearches.conf:

schedule_priority = [default | higher | highest]
schedule_window = <unsigned integer> | auto
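
As a rough sketch of how that might look in savedsearches.conf (the stanza name, search schedule, and window size here are illustrative, not from the original post):

[My Alert 1]
# Run every 3 minutes over the last 3 minutes of data
cron_schedule = */3 * * * *
dispatch.earliest_time = -3m
dispatch.latest_time = now
# Allow the scheduler to start this search any time within a 2-minute window
# instead of exactly on the cron boundary, so all 60 alerts don't fire at once
schedule_window = 2
# Leave priority at default; raise it only for alerts that must not be delayed
schedule_priority = default

With a schedule window set (or schedule_window = auto), the scheduler is free to delay the search within that window, which smooths out the concurrency spikes that cause skipped searches.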

https://docs.splunk.com/Documentation/Splunk/8.0.5/Alert/AlertSchedulingBestPractices

Hope this helps
