
JMS Messaging Modular Input Integration with Solace Systems

michaeltay
Path Finder

I am trying to poll messages from a Solace topic via JNDI.

The following configuration works fine for a Mule endpoint:

<!-- Solace connector: JNDI lookup of the connection factory over SMF -->
<jms:connector name="solaceToMQ" specification="1.1" username="topic1ClientID"
    validateConnections="true" persistentDelivery="true" cacheJmsSessions="true"
    eagerConsumer="true" doc:name="JMS" forceJndiDestinations="true"
    jndiDestinations="true" connectionFactoryJndiName="topic1JNDICF"
    jndiInitialFactory="com.solacesystems.jndi.SolJNDIInitialContextFactory"
    jndiProviderUrl="smf://192.168.100.53:55555">
    <!-- Credentials passed to the Solace JNDI initial context -->
    <spring:property name="jndiProviderProperties">
        <spring:map>
            <spring:entry key="java.naming.security.principal"
                value="default" />
            <spring:entry key="java.naming.security.credentials"
                value="987" />
        </spring:map>
    </spring:property>
</jms:connector>
...
    <jms:inbound-endpoint topic="topic1" connector-ref="solaceToMQ" doc:name="JMS"/>
    <logger message="#[message.payload.toString()]" level="INFO"
        doc:name="Logger" />
    <jms:outbound-endpoint queue="SplunkLogQueue" connector-ref="Active_MQ" doc:name="JMS"/>

Things I have done:
1) Copied all the jar files used by Solace
2) Configured the following settings (see the inputs.conf sketch after this list):

  • Topic Name: topic1
  • JMS Connection Factory JNDI Name: topic1JNDICF
  • JNDI Initial Context Factory Name: com.solacesystems.jndi.SolJNDIInitialContextFactory
  • JNDI Provider URL: smf://192.168.100.53:55555
  • JNDI Username: topic1ClientID
  • JNDI Password:
  • Topic/Queue Username: default
  • Topic Password: 987

3) Restarted the Splunk service
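
For reference, here is roughly how those settings map onto an inputs.conf stanza. The stanza name matches the error log below; the key names are from the jms_ta inputs.conf.spec as I recall it, so double-check against the spec file shipped with your version of the app:

    [jms://topic/:topic1]
    jms_connection_factory_name = topic1JNDICF
    jndi_initialcontext_factory = com.solacesystems.jndi.SolJNDIInitialContextFactory
    jndi_provider_url = smf://192.168.100.53:55555
    jndi_user = topic1ClientID
    jndi_pass =
    destination_user = default
    destination_pass = 987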

However, I am getting this error in the Splunk logs:

09-14-2016 12:01:06.903 +0800 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\jms_ta\bin\jms.py"" Stanza jms://topic/:topic1 : Error connecting : javax.naming.NamingException: JNDI lookup failed - 503: Service Unavailable
09-14-2016 12:01:06.903 +0800 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\jms_ta\bin\jms.py""       at com.solacesystems.jndi.SolJNDIInitialContextFactory$SolJNDIInitialContextImpl.lookup(SolJNDIInitialContextFactory.java:220)
09-14-2016 12:01:06.903 +0800 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\jms_ta\bin\jms.py""       at javax.naming.InitialContext.lookup(InitialContext.java:417)
09-14-2016 12:01:06.903 +0800 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\jms_ta\bin\jms.py""       at javax.naming.InitialContext.lookup(InitialContext.java:417)
09-14-2016 12:01:06.903 +0800 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\jms_ta\bin\jms.py""       at com.splunk.modinput.jms.JMSModularInput$MessageReceiver.connect(Unknown Source)
09-14-2016 12:01:06.903 +0800 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\jms_ta\bin\jms.py""       at com.splunk.modinput.jms.JMSModularInput$MessageReceiver.run(Unknown Source)

Any idea what is wrong?

1 Solution

michaeltay
Path Finder

Turns out, I need to put in the VPN details in User defined JNDI properties.
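
For anyone else who hits this: Solace identifies the message VPN through a JMS property, so supplying it as a user-defined JNDI property fixed the lookup. A sketch of what that looks like in inputs.conf (Solace_JMS_VPN is the Solace property name for the message VPN; the user_jndi_properties key and the VPN value "default" are illustrative, adjust for your broker):

    user_jndi_properties = Solace_JMS_VPN=default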


spsasi
New Member

Hi,

I guess you might have resolved the issue by now, but just FYI:

javax.naming.NamingException: JNDI lookup failed - 503: Service Unavailable

This means JNDI is not enabled on the Solace appliance. Use SolAdmin to enable JNDI under the JMS Admin tab.
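
If you prefer the CLI over SolAdmin, enabling JNDI looks roughly like this (SolOS prompts vary by version; "default" here is the message VPN name):

    solace> enable
    solace# configure
    solace(configure)# jndi message-vpn default
    solace(configure/jndi)# no shutdown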

Let me know if you need more details.


Damien_Dallimor
Ultra Champion

JNDI lookup failed - 503: Service Unavailable

Guessing here... firewall issue?


michaeltay
Path Finder

I'm able to establish communication via telnet to the destination port, so I doubt it's a firewall issue.


Damien_Dallimor
Ultra Champion

telnet and smf are different protocols.

I would try running a Mule instance with your known working Mule configuration from the same host as your Splunk instance, to isolate the root cause of the service-unavailable exception; it sounds like something very network-related.
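
An even quicker isolation test: run a bare-bones JNDI lookup from the Splunk host with the Solace jars on the classpath, completely outside Splunk and Mule. A sketch using the values from your post (the username@message-vpn principal format is a Solace convention, and "default" as the VPN name is an assumption):

    import java.util.Hashtable;
    import javax.jms.ConnectionFactory;
    import javax.naming.Context;
    import javax.naming.InitialContext;

    public class SolaceJndiLookupTest {
        public static void main(String[] args) throws Exception {
            Hashtable<String, Object> env = new Hashtable<>();
            // Same settings the Splunk modular input is using
            env.put(Context.INITIAL_CONTEXT_FACTORY,
                    "com.solacesystems.jndi.SolJNDIInitialContextFactory");
            env.put(Context.PROVIDER_URL, "smf://192.168.100.53:55555");
            // Solace convention: username@message-vpn
            env.put(Context.SECURITY_PRINCIPAL, "default@default");
            env.put(Context.SECURITY_CREDENTIALS, "987");

            InitialContext ctx = new InitialContext(env);
            // If this also throws "503: Service Unavailable", the problem is
            // between this host and the broker, not in the modular input
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("topic1JNDICF");
            System.out.println("JNDI lookup succeeded: " + cf);
            ctx.close();
        }
    }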


michaeltay
Path Finder

The Mule instance is located on the same host as the Splunk instance.


Damien_Dallimor
Ultra Champion

Can you copy/paste your actual inputs.conf stanza?
