Splunk AppDynamics

How do I prevent SocketTimeoutException exceptions from triggering Actions?

CommunityUser
Splunk Employee

I'm trying to report errors and exceptions that happen in a test environment into a Slack channel. Every time new code is deployed, every instance triggers an error like the following:

Count: 1
Event Type: APPLICATION_ERROR
Severity: ERROR (Application Error)
Time | App | Node: Thu Aug 31 14:27:51 PDT 2017 | app17 | app17_app17-test-app-1.corp.company.net

SocketTimeoutException: connect timed out

I have tried everything I can think of to prevent these exceptions from triggering Actions.

[Screenshots of the configuration]

Does this configuration not apply to the Alert & Respond policies?  My controller version is 4.2.15.3, build 41.


CommunityUser
Splunk Employee

I have had some success integrating with Slack. I did it with a Slack webhook, which was nothing more than a Slack app I created and enabled an incoming webhook for; you mainly need it for the URL (which carries a token) that AppDynamics will post events to. In AppDynamics, just create an HTTP request template with the following settings (a quick way to sanity-check the webhook URL itself is sketched right after them):

Request URL method: POST

Raw URL: (insert URL for the webhook for your slack app here)

Payload mime type: application/json
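
Before pointing AppDynamics at it, I would suggest confirming that the webhook URL works on its own. Here is a minimal sketch (the URL is a placeholder; swap in your own webhook URL) that just posts a plain text message:

import json
import urllib.request

# Placeholder: use the incoming webhook URL from your own Slack app
WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXXXXXX"

# Slack incoming webhooks accept a JSON body; "text" is the simplest possible message
payload = json.dumps({"text": "Test message from the AppDynamics template setup"}).encode("utf-8")

request = urllib.request.Request(
    WEBHOOK_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Slack replies with a plain-text "ok" when it accepts the message
with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode("utf-8"))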

Here is the JSON I use. It builds a message with attachments to organize the content, colorizes it based on the event severity (error, warn, info), and normalizes the HTML tags in the message to markdown, which Slack can display as rich output.

## Convert the basic HTML tags in the event message to Slack markdown
#set( $em = ${latestEvent.eventMessage} )
#set( $em = $em.replace("<b>", "*") )
#set( $em = $em.replace("</b>", "*") )
#set( $em = $em.replace("<br>", "\n") )

## Fall back to "n/a" when the event has no tier or node
#if ( ${latestEvent.tier.name} )#set ( $tier-name = ${latestEvent.tier.name} )#else#set ( $tier-name = "n/a" )#end
#if ( ${latestEvent.node.name} )#set ( $node-name = ${latestEvent.node.name} )#else#set ( $node-name = "n/a" )#end

## Map the event severity to a Slack attachment color
#if ( ${topSeverity} == "ERROR" )#set ( $severity = "danger" )#end
#if ( ${topSeverity} == "WARN" )#set ( $severity = "warning" )#end
#if ( ${topSeverity} == "INFO" )#set ( $severity = "good" )#end

{
    "attachments": [
        {
            "fallback": "AppDynamics Alert",
            "color": "$severity",
            "pretext": "AppDynamics ${topSeverity} Alert",
            "title": "${latestEvent.deepLink}",
            "title_link": "${latestEvent.deepLink}",
            "text":"${latestEvent.healthRule.name}",
            "mrkdwn_in": ["fields"],
            "fields": [
                {
                    "title": "Severity",
                    "value": "${topSeverity}",
                    "short": true
                },
                {
                    "title": "Application",
                    "value": "${latestEvent.application.name}",
                    "short": true
                },
                {
                    "title": "Tier",
                    "value": "$tier-name",
                    "short": true
                },
                {
                    "title": "Node",
                    "value": "$node-name",
                    "short": true
                },
                {
                    "title": "Message",
                    "value": "$em",
                    "short": false
                }
            ],
            "image_url": "latestEvent.severityImage.deepLink"
        }
    ]
}
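
If I recall correctly, the last step is to create an HTTP request action based on this template under Alert & Respond and attach that action to whichever policy should post to Slack; the policy itself needs nothing Slack-specific beyond that.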

I hope this helps you.
