Hi Team,
How do we pass the link to the alert results in the Slack messages sent from Splunk?
Use this app and its instructions:
See if these help:
https://docs.splunk.com/Documentation/Splunk/8.0.3/Alert/AlertSchedulingBestPractices
https://www.splunk.com/en_us/blog/tips-and-tricks/schedule-windows-vs-skewing.html
I believe you may be experiencing schedule skewing.
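If you want to check for skew on this alert, here is a rough search sketch against the scheduler logs (it assumes the scheduled_time and dispatch_time fields are populated for your saved search, so treat the field names as an assumption):
index=_internal sourcetype=scheduler savedsearch_name="your saved search name" | eval skew_seconds=dispatch_time-scheduled_time | stats avg(skew_seconds) max(skew_seconds)
A consistently large gap between the scheduled time and the dispatch time would point to skew or a busy scheduler rather than a Slack-side problem.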
Hi All,
I have configured both Slack and email actions for the same alert, but we are not getting the same number of alerts in Slack. The alert is scheduled to run every 2 minutes. The email alerts are coming through properly, but the Slack ones are not. What could be the issue?
There is also a delay in the alerts that do reach Slack. Quick help would be appreciated.
Are you sure you are not hitting Slack's API rate limit?
Limits are applied per workspace, so if you have several integrations (in your channel and in other channels) you could find that Slack is limiting the number of messages delivered.
https://api.slack.com/docs/rate-limits
@nickhillscpl
How do we confirm whether a rate limit is being applied to a specific channel or not?
By running the curl command....
How do we do that?
I have shared the curl command above. Here is the doc I got it from:
https://api.slack.com/tutorials/slack-apps-hello-world
Please work with your server admin team if you don't know how to do this.
curl -X POST -H 'Content-type: application/json' --data '{"text":"Allow me to reintroduce myself!"}' YOUR_WEBHOOK_URL
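If you want to confirm the rate-limit theory, here is a minimal variation of that command (assumptions: replace YOUR_WEBHOOK_URL with your actual incoming webhook URL; a 429 status code in the output would indicate Slack is throttling the webhook):
curl -s -o /dev/null -w '%{http_code}\n' -X POST -H 'Content-type: application/json' --data '{"text":"rate limit test"}' YOUR_WEBHOOK_URL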
@jkat54
Before I go ahead and accept this answer, I need to know a few things.
I am encountering the same issue again: alerts are coming to the Slack channel, but with a delay of 30 minutes. For instance, if the alert is scheduled to run every 5 minutes, in 30 minutes I am getting only 2 alerts. Is it because of a network issue?
Check
index=_internal sourcetype=scheduler "your saved search name"
See if there are any issues where the search is not returning expected results.
@jkat54
Do you think it is a connectivity issue?
Splunk likes to skip searches when it's busy. That's my best guess.
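To check whether your alert is being skipped, here is a sketch of a scheduler search (assuming skipped runs are logged with status=skipped and carry a reason field):
index=_internal sourcetype=scheduler savedsearch_name="your saved search name" status=skipped | stats count by reason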
@jkat54
I don't see any errors in the scheduler logs. Do you want me to search for anything specific?
@jkat54
One more thing I would like to highlight: the webhook I created is no longer visible under the "Manage Webhook" page, but alerts are still coming into Slack with a delay of 60-90 minutes.
@jkat54
I have also observed that we get error code 255 only during the intervals when alerts were not received in the Slack channel from Splunk.
Have you tried using curl from the server to post to Slack? (See the sketch after these suggestions.)
Maybe there is packet loss or a very slow connection to Slack.
Or have you tried looking at the scheduler logs?
index=_internal sourcetype=scheduler
Have you tried the job inspector to see if search.log has any errors when the alerts run?
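For the curl test suggested above, here is a sketch you could run from the search head's shell (assumptions: outbound HTTPS to Slack is allowed from that host, and YOUR_WEBHOOK_URL is replaced with your webhook; the -w format prints the HTTP status and total request time, so a slow or lossy path to Slack shows up as high time_total values):
curl -s -o /dev/null -w 'http=%{http_code} time_total=%{time_total}s\n' -X POST -H 'Content-type: application/json' --data '{"text":"connectivity test"}' YOUR_WEBHOOK_URL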
@jkat54
We are getting alerts in Slack, but there is a time delay. Suppose I schedule an alert to run every 2 minutes with both email and Slack actions; in the next 6 minutes I get 3 email alerts but only 2 Slack alerts.
And in splunkd.log I see this error message: "Alert action script returned error code=255". What could be the issue?
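One way to dig into that error code=255 is to search splunkd.log for the alert action's own logging; a sketch, assuming the Slack alert action logs under the usual sendmodalert component for modular alert actions:
index=_internal sourcetype=splunkd component=sendmodalert slack
Filtering that search on log_level=ERROR around the gaps should show what the script reported when it failed.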
Sounds like connectivity issues between your search head(s) and the Slack REST endpoints.
Have you tried using curl from the server to post to slack?
@jkat54
How do we do that?
Which command do we run, and from where?