Reporting

Fix alert skipping due to reaching "maximum number of concurrent running jobs"

bruceclarke
Contributor

I'm trying to debug why one of our saved search alerts recently started skipping. Splunk says that another instance of the alert is running, and it cannot kick off a second run of the search until that original run finishes. More specifically, the error says: "The maximum number of concurrent running jobs for this historical scheduled search on this instance has been reached".

The documentation (see below) says to either increase the quota for the saved search or delete the job from the job queue. Unfortunately, neither works for me. I don't want to increase the quota (I only want one instance of the search running at a time), and I don't see any instance of the search in the job queue, so I can't kill it.

I want to kill whatever search is supposedly running so that future alerts can run without hitting this error. Does anyone know why it might not be showing up in the job queue? What steps can I take to fix this? Thanks!

I've already reviewed the following documentation:
* https://answers.splunk.com/answers/54674/how-to-increase-the-maximum-number-of-concurrent-historical...
* http://wiki.splunk.com/Community:TroubleshootingSearchQuotas
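
For reference, this is roughly how I've been looking for in-flight instances of the search beyond what the Job Manager UI shows (just a sketch; "My Alert Search" is a placeholder for the real saved search name, and the field names are from memory):

    | rest /services/search/jobs
    | search isDone=0 label="My Alert Search"
    | table sid label dispatchState runDuration isZombie

If anything does show up there with isZombie=1 or a dispatchState that never changes, that would at least be something to kill from the UI or the REST endpoint.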


pradeepkumarg
Influencer

Why do you want to kill the search? Doesn't that defeat the purpose, if you clear the way for a new run only to kill it later?
Your better options are to:
* Spread out the schedule so that consecutive runs don't step on each other, or
* Increase the max concurrency for that particular search so that multiple instances of it can run in parallel (sketched below).
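
If you go the concurrency route, the per-search limit lives in savedsearches.conf (the stanza name is your saved search name; the one below is just a placeholder):

    [My Alert Search]
    max_concurrent = 2

That said, if you truly only ever want one run at a time, spreading the schedule out is the cleaner fix.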


bruceclarke
Contributor

To set context, this search has been running for months with no problems. The issue seems to be that something internal to Splunk is off. There is no instance of the search currently running (at least nothing I can see in the job queue), yet for over a week now the internal logs have been saying that the alert is being skipped.

In short, I don't believe this is a problem with the search itself. I believe it is a problem with something internal to Splunk. We somehow got into a weird state, and I'm trying to understand how that happened.


pradeepkumarg
Influencer

OK, check the run_time of this search over time. It could very well be that the search is now taking longer to finish than it used to. That can happen if you have more data to search than before, or for some other reason that is degrading search performance.
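
Something like this against the scheduler logs will show the trend (the saved search name below is a placeholder):

    index=_internal sourcetype=scheduler savedsearch_name="My Alert Search"
    | timechart span=1d avg(run_time) max(run_time)

If max(run_time) has crept up toward (or past) the schedule interval, that would explain the concurrency skips.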


bruceclarke
Contributor

@gpradeepkumarreddy - I really appreciate the help here. Nothing in the internal logs shows that the search is running. However, I am still seeing the alert skipped due to "The maximum number of concurrent running jobs for this historical scheduled search on this instance has been reached".

Over time, the search has consistently finished well within the time allotted to it. In fact, I was able to copy the alert, and the copied version runs fine.

Given all of this, I am confident the issue is not the runtime of the search. Instead, I believe something is internally inconsistent in our Splunk infrastructure, causing Splunk to think an instance of the alert is already running when none is. So I'm wondering if there is something else I can do to "kill" whatever artifact of the search exists that makes Splunk believe an instance is already running.

In short, I am confident this is not an issue with the alert itself. I believe this is a problem with Splunk's determination that the alert is already running.
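
For anyone else hitting this: the skips themselves show up in the scheduler logs, which is what I've been using to confirm the error (placeholder search name again; field names from memory):

    index=_internal sourcetype=scheduler savedsearch_name="My Alert Search" status=skipped
    | stats count by reason

The on-disk artifacts for scheduled runs live under $SPLUNK_HOME/var/run/splunk/dispatch on the search head, so a stale directory there for this search is the other place I'm checking for something to clear out.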
