Max time spent in per-result alerts issue

mdsnmss
SplunkTrust

I'm running into an issue with a scheduled search not completing all of its alert actions when it executes. The search is scheduled to run every minute, and the warning occurs approximately every minute. We changed the schedule to every 5 minutes and now get the warning approximately every 5 minutes. This is the warning:

04-27-2018 10:40:25.092 -0400 WARN SavedSplunker - Reached maximum amount of time allowed to spend in per-result alerts for savedsearch_id="nobody;app;search", sid="scheduler__user_Y3liZXJdrchyZWF0__RMD512364230b266f208_at_1524839700_11453_84568E04-6E8F-4D6B-7HF0C-1CD3DC88879B", max_alerts_time=300 (seconds), fired_alerts=172

Regardless of the schedule, fired_alerts typically falls in the 150-190 range. In alert_actions.conf I found the default setting maxtime = 5m and changed it to maxtime = 10m. btool shows the new setting applying, but the warning still reports a 300-second time limit. Is there another setting to look at? I've dug through limits.conf, alert_actions.conf, and savedsearches.conf and didn't see anything, but perhaps I missed something.
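
In case it helps, this is roughly how I checked which layer the effective values were coming from (a sketch assuming a default install with $SPLUNK_HOME set; adjust paths for your environment):

$SPLUNK_HOME/bin/splunk btool alert_actions list --debug | grep maxtime
$SPLUNK_HOME/bin/splunk btool limits list scheduler --debug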

1 Solution

mdsnmss
SplunkTrust

We were able to remove the caps by raising these settings in limits.conf:

[scheduler]
# threads used to execute alert actions
action_execution_threads = 10
# maximum number of alert actions allowed to wait in the queue
actions_queue_size = 10000
# maximum number of per-result alert actions fired per search run
max_per_result_alerts = 10000
# maximum time (seconds) to spend firing per-result alerts; the default of 300
# is the limit reported in the warning above
max_per_result_alerts_time = 600

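As a follow-up, one way to confirm the new limits are working is to watch the internal logs for the warning after a restart (a sketch; the component and message text match the warning quoted above):

index=_internal sourcetype=splunkd component=SavedSplunker "Reached maximum amount of time allowed to spend in per-result alerts"
| stats count by savedsearch_id

If the count drops to zero after the change, the scheduler is no longer hitting the per-result alert time cap.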