
Alert setup best practice to avoid max concurrent searches

pdantuuri0411
Explorer

Hi, we are trying to set up around 60 alerts. Ideally, each alert runs every 3 minutes and checks the data for the last 3 minutes. I am aware of the issue with concurrent searches and alerts getting skipped when there are more than 5 concurrent searches.

What is the best way to create these alerts?

Is there a way to stagger the alerts so they start a few seconds apart within the same minute, like the example below?

Example - 

Alert 1 - 12:00:00

Alert 2 - 12:00:05

Alert 3 - 12:00:10

Alert 4 - 12:00:15


anilchaithu
Builder

@pdantuuri0411 

It is possible to distribute the search jobs by using one of the techniques below:

  • schedule_priority (use this for better results)
  • schedule window (schedule_window)

You can configure these through Searches, Reports & Alerts -> Edit -> Advanced Edit,

OR 

add the attributes below to the relevant stanza in savedsearches.conf:

schedule_priority = [default | higher | highest]
schedule_window = <unsigned integer> | auto
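
For illustration, a minimal sketch of what one such stanza could look like for your use case (the alert name and search string here are placeholders, not from your post):

[Alert 1 - example error check]
# placeholder search; replace with the actual alert search
search = index=main log_level=ERROR | stats count
enableSched = 1
# run every 3 minutes over the last 3 minutes of data
cron_schedule = */3 * * * *
dispatch.earliest_time = -3m
dispatch.latest_time = now
# allow the scheduler to delay this job within a window
# instead of forcing it to start exactly on the minute
schedule_window = auto
schedule_priority = default

With schedule_window set, the scheduler is free to spread the 60 jobs out across their windows rather than launching them all at the same minute, which is what keeps you under the concurrent search limit.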

https://docs.splunk.com/Documentation/Splunk/8.0.5/Alert/AlertSchedulingBestPractices

Hope this helps
