Alerting

Slack Notification alert failing with Error code 1

vn_g
Path Finder

01-24-2024 10:24:31.312 +0000 WARN  sendmodalert [3050674 AlertNotifierWorker-0] - action=slack - Alert action script returned error code=1
01-24-2024 10:24:31.312 +0000 INFO  sendmodalert [3050674 AlertNotifierWorker-0] - action=slack - Alert action script completed in duration=96 ms with exit code=1
01-24-2024 10:24:31.304 +0000 FATAL sendmodalert [3050674 AlertNotifierWorker-0] - action=slack STDERR -  Alert action failed
01-24-2024 10:24:31.304 +0000 INFO  sendmodalert [3050674 AlertNotifierWorker-0] - action=slack STDERR -  Slack API responded with HTTP status=200
01-24-2024 10:24:31.304 +0000 INFO  sendmodalert [3050674 AlertNotifierWorker-0] - action=slack STDERR -  Using configured Slack App OAuth token: xoxb-XXXXXXXX
01-24-2024 10:24:31.304 +0000 INFO  sendmodalert [3050674 AlertNotifierWorker-0] - action=slack STDERR -  Running python 3
01-24-2024 10:24:31.212 +0000 INFO  sendmodalert [3050674 AlertNotifierWorker-0] - Invoking modular alert action=slack for search="Updated Testing Nagasri Alert" sid="scheduler_xxxxx__RMDxxxxxxx" in app="xxxxx" owner="xxxx" type="saved"

I have done the entire setup correctly: created a Slack app with the chat:write scope, added the channel to the app, and obtained the OAuth token and the channel's webhook link. But sendalert is failing with error code 1, and the GitHub README "slack-alerts/src/app/README.md at main · splunk/slack-alerts (github.com)" doesn't mention this error. Is this an issue on the Splunk end or the Slack end, and what would be the fix?

 


datadevops
Path Finder

Hi there,

Understanding the Error:

  • Error code 1 is the script's generic failure exit; on its own it doesn't pinpoint the cause.
  • The logs show HTTP status 200 from Slack, but the Slack Web API reports application-level failures (for example channel_not_found, not_in_channel, or invalid_auth) in the JSON body with "ok": false while still returning HTTP 200, so the script can legitimately log status 200 and then exit with code 1.
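Because Slack puts the real error in the body of an HTTP 200 response, a quick way to see what it actually said is to replay the same chat.postMessage call outside Splunk. A minimal sketch, assuming Python 3 with the requests library and placeholder token and channel values (swap in your own):

import requests  # assumption: requests is installed; urllib.request works as well

SLACK_TOKEN = "xoxb-XXXXXXXX"    # placeholder: your bot token
CHANNEL = "#your-alert-channel"  # placeholder: the channel the alert posts to

resp = requests.post(
    "https://slack.com/api/chat.postMessage",
    headers={"Authorization": f"Bearer {SLACK_TOKEN}"},
    json={"channel": CHANNEL, "text": "Splunk alert action test"},
    timeout=10,
)
body = resp.json()
# Slack returns HTTP 200 even on failure; the verdict is in the JSON body.
print("http_status:", resp.status_code)
print("ok:", body.get("ok"))
print("error:", body.get("error"))  # e.g. channel_not_found, not_in_channel, invalid_auth

If ok comes back false, the error field usually names the fix directly (for example not_in_channel means the bot still has to be invited to the channel). Running the same script on the search head itself, for example with $SPLUNK_HOME/bin/splunk cmd python3 if your version provides it, also surfaces proxy or certificate problems specific to that host.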

Troubleshooting Steps:

  1. Double-Check Configuration:
    • Verify your Slack app setup, OAuth token, webhook URL, and Splunk alert action configuration for typos or inconsistencies. To confirm that the token itself is valid, you can call Slack's auth.test endpoint; see the token-check sketch after this list.
    • Ensure the app has the chat:write scope and has been added to the intended channel (with a bot token, the bot usually needs to be invited to the channel, e.g. with /invite).
  2. Examine Script Logs:
    • Search the sendmodalert output for the detailed STDERR lines, for example with index=_internal sendmodalert action=slack, which usually contain a more specific message than exit code 1 alone.
  3. Review Alert Action Script:
    • If you are using a custom or modified script, inspect the code for errors or conflicts.
    • Verify that the script checks the "ok" field in the Slack API response body rather than only the HTTP status, and that it handles exceptions.
  4. Upgrade Splunk and Apps:
    • Use current versions of Splunk and the Slack alerts app to benefit from bug fixes and improvements.
  5. Consult Splunk Documentation and Community:
    • Check Splunk's official documentation and community forums for known issues, workarounds, and best practices for the Slack integration.
  6. Engage Splunk Support:
    • If the issue persists, reach out to Splunk Support for more in-depth assistance.
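For the token check in step 1: a minimal sketch, assuming Python 3 with the requests library and a placeholder token. Slack's auth.test endpoint only confirms that the token is valid and which workspace and bot identity it authenticates as; it does not prove the bot can post to your channel.

import requests  # assumption: requests is installed; urllib.request works as well

SLACK_TOKEN = "xoxb-XXXXXXXX"  # placeholder: the token configured in the alert action

resp = requests.post(
    "https://slack.com/api/auth.test",
    headers={"Authorization": f"Bearer {SLACK_TOKEN}"},
    timeout=10,
)
body = resp.json()
if body.get("ok"):
    # Token is valid; shows which workspace and bot user it belongs to.
    print("token OK: team=%s user=%s" % (body.get("team"), body.get("user")))
else:
    # e.g. invalid_auth, token_revoked, account_inactive
    print("token problem:", body.get("error"))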

Additional Tips:

  • Test your Slack integration independently of Splunk's alert system to isolate the problem; the chat.postMessage replay above is one way to do this.
  • Consider capturing the detailed traffic between Splunk and Slack for further analysis, either with a network monitoring tool or with the lighter-weight debug trick sketched below.
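As a lighter-weight alternative to a full packet capture when running the replay above, you can turn on http.client's wire-level debug output so the exact request and response headers exchanged with slack.com are printed. A small sketch under the same assumptions (placeholder token):

import http.client
import requests  # assumption: requests is installed

# Print the raw request lines, headers, and response status that http.client
# sends and receives (before TLS encryption), so no separate capture is needed.
http.client.HTTPConnection.debuglevel = 1

requests.post(
    "https://slack.com/api/auth.test",
    headers={"Authorization": "Bearer xoxb-XXXXXXXX"},  # placeholder token
    timeout=10,
)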

~ If the reply helps, a Karma upvote would be appreciated

 
