Alerting

Splunk custom alerts are not working?

btshivanand
Path Finder

We discovered that Splunk custom alerts are not working after upgrading to version 8.0.1. We are also receiving a "searches delayed" error on the cluster. Can anyone help with this?


richgalloway
SplunkTrust

Search problems could cause alert problems since alerts are just searches with an action, but we'll need more information.

What is the exact text of the error message(s) you receive?

Did anything else change besides the version of Splunk?

What is your Splunk environment like (single instance, cluster, etc.)?

When you say the custom alerts are "not working" what exactly do you mean?  What are the symptoms?  Are they running at all? Have you checked the search log for the alerts?

---
If this reply helps you, Karma would be appreciated.

btshivanand
Path Finder

What is the exact text of the error message(s) you receive?

I see the error below on the search head captain, and its status shows as red.

Root Cause(s):

The percentage of high priority searches delayed (11%) over the last 24 hours is very high and exceeded the red thresholds (10%) on this Splunk instance. Total Searches that were part of this percentage=100. Total delayed Searches=11

Last 50 related messages:


Did anything else change besides the version of Splunk?

We have not changed anything after version upgrade.

What is your Splunk environment like (single instance, cluster, etc.)?

Our environment is clustered: 4 indexers, 3 search heads, 1 deployer, and 1 master.

When you say the custom alerts are "not working" what exactly do you mean?  What are the symptoms?  Are they running at all? Have you checked the search log for the alerts?

Custom alert conditions are not working. I investigated the scheduler log and could see that the status is success but alert_actions is blank. One user raised a concern that an alert was not triggering when the custom condition was met. Later we realized it is not working for any of the alerts that use a custom condition.

10-31-2020 08:10:07.566 +0000 INFO SavedSplunker - savedsearch_id="XXX;search; alert", search_type="", user="XXX", app="search", savedsearch_name="XXXX alert", priority=default, status=success, digest_mode=1, scheduled_time=1604131800, window_time=0, dispatch_time=1604131805, run_time=1.785, result_count=1015, alert_actions="", sid="scheduler__smadan__search__RMD5ab6a869ca92dbacc_at_1604131800_63960_638683B3-25D9-4D2A-AF2E-4E43362FDBFA", suppressed=0, thread_id="AlertNotifierWorker-0", workload_pool=""
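The telling detail in a scheduler log line like the one above is the combination of status=success with an empty alert_actions field: the search ran, but no action fired. As a rough illustration of how to spot that pattern, here is a minimal sketch of a key=value parser; parse_scheduler_line is a hypothetical helper written for this example, not part of any Splunk tooling.

```python
import re

def parse_scheduler_line(line):
    """Extract key=value fields from a Splunk scheduler.log entry.

    Handles both quoted (key="value") and unquoted (key=value) fields,
    based on the sample log line above. This is an illustrative sketch,
    not an official log parser.
    """
    fields = {}
    for m in re.finditer(r'(\w+)="([^"]*)"|(\w+)=([^,\s]+)', line):
        if m.group(1) is not None:
            fields[m.group(1)] = m.group(2)   # quoted value (may be empty)
        else:
            fields[m.group(3)] = m.group(4)   # unquoted value
    return fields

# Example: a trimmed-down version of the log line above.
sample = 'status=success, result_count=1015, alert_actions="", suppressed=0'
f = parse_scheduler_line(sample)
if f.get("status") == "success" and not f.get("alert_actions"):
    print("search ran but no alert action fired")
```

In a working alert, alert_actions would list the configured actions (e.g. "email"); seeing it empty on every run of the custom-condition alerts is what points at the alert condition evaluation rather than the search itself.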
