Monitoring Splunk

Alert setup best practice to avoid max concurrent searches

pdantuuri0411
Explorer

Hi, we are trying to set up around 60 alerts. Ideally, each alert runs every 3 minutes and checks the data for the last 3 minutes. I am aware of the issue with concurrent searches: alerts get skipped when there are more than 5 concurrent searches.

What is the best way to create these alerts?

Is there a way to stagger the alerts between minutes, as in the example below?

Example - 

Alert 1 - 12:00:00

Alert 2 - 12:00:05

Alert 3 - 12:00:10

Alert 4 - 12:00:15


anilchaithu
Builder

@pdantuuri0411 

It's possible to distribute the search jobs using one of the techniques below:

  • schedule_priority (use this for better results)
  • search window

You can configure this through Searches, Reports & Alerts -> Edit -> Advanced Edit

OR 

add the following attributes to the relevant stanza in savedsearches.conf:

schedule_priority = [default | higher | highest]
schedule_window = <unsigned integer> | auto
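
For example, a stanza for one of the 60 alerts might look like the sketch below. The stanza name and time range are illustrative; cron_schedule, dispatch.earliest_time, dispatch.latest_time, schedule_priority, and schedule_window are all standard savedsearches.conf attributes. A schedule_window of auto lets the scheduler delay the search within a window it calculates, instead of skipping it when concurrency limits are hit.

```
# Hypothetical stanza in savedsearches.conf (alert name is illustrative)
[my_alert_01]
cron_schedule = */3 * * * *
dispatch.earliest_time = -3m
dispatch.latest_time = now
# Let the scheduler shift this job within an auto-sized window
# rather than skip it when too many searches run concurrently
schedule_window = auto
```

Note that a schedule_window only helps for alerts that do not need to fire at an exact time; for truly time-sensitive alerts, schedule_priority = higher is the better lever.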

https://docs.splunk.com/Documentation/Splunk/8.0.5/Alert/AlertSchedulingBestPractices

Hope this helps
