
Talk to Splunk with Amazon Alexa: Why can't I get Python Service to listen on port 443?

Communicator

Hi there, when I run the Talk to Splunk with Amazon Alexa app and create a modular input, it starts listening on 443 (I can see this with netstat), but then stops listening after a few seconds and logs the errors below. The Splunk server is called "splunklin", and I use dynamic DNS to direct external traffic to myserver.ddns.net (changed for anonymity). The firewall NATs external port 443 to port 443 on my Splunk server's internal IP. Name resolution might have been an issue, but I believe I resolved that by editing /etc/hosts (it's CentOS) with the following line:

192.168.2.71 myserver.ddns.net splunklin

So the server can resolve either name to the internal IP address (pinging either one works). The certificate is set up for myserver.ddns.net, and the modular input sets its endpoint to myserver.ddns.net/alexa. Right after I hit save, netstat shows the port listening for about 10 seconds, and then it craps out. Here's what splunkd.log shows:

10-27-2016 17:35:52.069 -0400 ERROR ModularInputs - <stderr> Argument validation for scheme=alexa:  Can't determine timestampTolerance value, will revert to default value.

10-27-2016 17:35:52.155 -0400 INFO  ExecProcessor - Removing status item "/opt/splunk/etc/apps/alexa/bin/alexa.py (isModInput=yes)

10-27-2016 17:35:52.157 -0400 INFO  ExecProcessor - New scheduled exec process: python /opt/splunk/etc/apps/alexa/bin/alexa.py

10-27-2016 17:35:52.840 -0400 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/alexa/bin/alexa.py" Probing socket connection to SplunkD failed.Either SplunkD has exited ,or if not,  check that your DNS configuration is resolving your system's hostname (splunklin) correctly : Connection refused

10-27-2016 17:35:53.474 -0400 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/alexa/bin/alexa.py" SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".

10-27-2016 17:35:53.474 -0400 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/alexa/bin/alexa.py" SLF4J: Defaulting to no-operation (NOP) logger implementation

10-27-2016 17:35:53.474 -0400 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/alexa/bin/alexa.py" SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

10-27-2016 17:36:02.841 -0400 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/alexa/bin/alexa.py" Probing socket connection to SplunkD failed.Either SplunkD has exited ,or if not,  check that your DNS configuration is resolving your system's hostname (splunklin) correctly : Connection refused

10-27-2016 17:36:12.842 -0400 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/alexa/bin/alexa.py" Probing socket connection to SplunkD failed.Either SplunkD has exited ,or if not,  check that your DNS configuration is resolving your system's hostname (splunklin) correctly : Connection refused

10-27-2016 17:36:12.842 -0400 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/alexa/bin/alexa.py" Determined that Splunk has probably exited, HARI KARI.

Any ideas? Thanks!


Re: Talk to Splunk with Amazon Alexa: Why can't I get Python Service to listen on port 443?

Ultra Champion

Stop for a second and climb out of that rabbit hole.

What you are seeing is something in the modular input framework subsystem, not the Alexa app code itself.
The framework manages its own lifecycle by doing socket pings back to SplunkD to check that the SplunkD process is still running, amongst other things to prevent "zombie" processes after SplunkD restarts.
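That lifecycle check is exactly what's failing in your log. Just to illustrate the idea (this is not the app's actual code; the management port 8089 and the use of the system hostname here are assumptions on my part), the probe amounts to something like:

import socket
import sys
import time

SPLUNKD_MGMT_PORT = 8089     # default splunkd management port (assumed here)
PROBE_INTERVAL_SECONDS = 10  # matches the ~10 second gaps in your log

def splunkd_alive(host):
    # Try a plain TCP connection to splunkd's management port.
    try:
        conn = socket.create_connection((host, SPLUNKD_MGMT_PORT), timeout=5)
        conn.close()
        return True
    except socket.error:
        return False

def watchdog():
    # The probe targets the system's hostname ("splunklin" in your case), so if
    # that name doesn't resolve back to the box splunkd is running on, every
    # probe fails with "Connection refused".
    host = socket.gethostname()
    failures = 0
    while True:
        if splunkd_alive(host):
            failures = 0
        else:
            failures += 1
            if failures >= 3:
                # i.e. "Determined that Splunk has probably exited, HARI KARI."
                sys.exit(1)
        time.sleep(PROBE_INTERVAL_SECONDS)

So the fix is purely about making that hostname resolve back to the local machine.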

Try an /etc/hosts entry to resolve splunklin to 127.0.0.1
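For example (the addresses and layout below are just an illustration; adjust to your own setup):

127.0.0.1       localhost splunklin
192.168.2.71    myserver.ddns.net

Afterwards, something like "getent hosts splunklin" should return 127.0.0.1, and the modular input's probe back to SplunkD should stop being refused.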


Re: Talk to Splunk with Amazon Alexa: Why can't I get Python Service to listen on port 443?

Communicator

Wow... that was WAY too easy. Thank you!

I've got other modular alerts that work fine despite the poor job I did setting up name resolution on the server. Maybe they just don't rely on the same lifecycle check in the modular input framework.

In any case, problem solved, thank you!
