All Apps and Add-ons

ConnectError: Unable to open management session. Please confirm URI namespace exists.

rayar
Path Finder

Hi

We installed the Microsoft Azure Add-on for Splunk (https://splunkbase.splunk.com/app/3757/, version 2.1.0).

Is there any documentation we can use for the configuration that should be done on the Azure side?

On the Splunk side we are now getting the following error:
2020-06-24 16:26:32,680 ERROR pid=21096 tid=MainThread file=base_modinput.py:log_error:307 | Get error when collecting events.
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/TA-MS-AAD/bin/ta_ms_aad/modinput_wrapper/base_modinput.py", line 127, in stream_events
    self.collect_events(ew)
  File "/opt/splunk/etc/apps/TA-MS-AAD/bin/azure_event_hub.py", line 92, in collect_events
    input_module.collect_events(self, ew)
  File "/opt/splunk/etc/apps/TA-MS-AAD/bin/input_module_azure_event_hub.py", line 113, in collect_events
    partition_ids = client.get_partition_ids()
  File "/opt/splunk/etc/apps/TA-MS-AAD/bin/azure/eventhub/client.py", line 163, in get_partition_ids
    return self.get_properties()['partition_ids']
  File "/opt/splunk/etc/apps/TA-MS-AAD/bin/azure/eventhub/client.py", line 146, in get_properties
    response = self._management_request(mgmt_msg, op_type=b'com.microsoft:eventhub')
  File "/opt/splunk/etc/apps/TA-MS-AAD/bin/azure/eventhub/client.py", line 127, in _management_request
    self._handle_exception(exception, retry_count, max_retries)
  File "/opt/splunk/etc/apps/TA-MS-AAD/bin/azure/eventhub/client.py", line 105, in _handle_exception
    _handle_exception(exception, retry_count, max_retries, self)
  File "/opt/splunk/etc/apps/TA-MS-AAD/bin/azure/eventhub/error.py", line 196, in _handle_exception
    raise error
ConnectError: Unable to open management session. Please confirm URI namespace exists.

2020-06-28 16:36:31,154 INFO pid=19209 tid=MainThread file=mgmt_operation.py:__init__:65 | 'Failure: getaddrinfo failure 66.' ('/data/src/vendor/azure-uamqp-c/deps/azure-c-shared-utility/adapters/socketio_berkeley.c':'lookup_address_and_initiate_socket_connection':283)
2020-06-28 16:36:31,155 INFO pid=19209 tid=MainThread file=mgmt_operation.py:__init__:65 | 'lookup_address_and_connect_socket failed' ('/data/src/vendor/azure-uamqp-c/deps/azure-c-shared-utility/adapters/socketio_berkeley.c':'socketio_open':766)
2020-06-28 16:36:31,155 INFO pid=19209 tid=MainThread file=mgmt_operation.py:__init__:65 | 'Closing tlsio from a state other than TLSIO_STATE_EXT_OPEN or TLSIO_STATE_EXT_ERROR'
2020-06-28 16:36:31,155 INFO pid=19209 tid=MainThread file=mgmt_operation.py:__init__:65 | 'Invalid tlsio_state. Expected state is TLSIO_STATE_OPENING_UNDERLYING_IO.' ('/data/src/vendor/azure-uamqp-c/deps/azure-c-shared-utility/adapters/tlsio_openssl.c':'on_underlying_io_open_complete':760)
2020-06-28 16:36:31,156 INFO pid=19209 tid=MainThread file=mgmt_operation.py:__init__:65 | 'Failed opening the underlying I/O.' ('/data/src/vendor/azure-uamqp-c/deps/azure-c-shared-utility/adapters/tlsio_openssl.c':'tlsio_openssl_open':1258)
2020-06-28 16:36:31,156 INFO pid=19209 tid=MainThread file=mgmt_operation.py:__init__:65 | 'xio_open failed' ('/data/src/vendor/azure-uamqp-c/src/saslclientio.c':'saslclientio_open_async':1097)
2020-06-28 16:36:31,156 INFO pid=19209 tid=MainThread file=mgmt_operation.py:__init__:65 | 'Opening the underlying IO failed' ('/data/src/vendor/azure-uamqp-c/src/connection.c':'connection_open':1344)
2020-06-28 16:36:31,156 INFO pid=19209 tid=MainThread file=connection.py:_state_changed:177 | Connection 'e3accde9-ddf9-48cc-8b99-b7b60b83596b' state changed from <ConnectionState.START: 0> to <ConnectionState.END: 13>
2020-06-28 16:36:31,156 INFO pid=19209 tid=MainThread file=connection.py:_state_changed:181 | Connection with ID 'e3accde9-ddf9-48cc-8b99-b7b60b83596b' unexpectedly in an error state. Closing: False, Error: None
2020-06-28 16:36:31,156 INFO pid=19209 tid=MainThread file=mgmt_operation.py:__init__:65 | 'Begin session failed' ('/data/src/vendor/azure-uamqp-c/src/link.c':'link_attach':1154)
2020-06-28 16:36:31,156 INFO pid=19209 tid=MainThread file=mgmt_operation.py:__init__:65 | 'Link attach failed' ('/data/src/vendor/azure-uamqp-c/src/message_receiver.c':'messagereceiver_open':362)
2020-06-28 16:36:31,156 DEBUG pid=19209 tid=MainThread file=mgmt_operation.py:__init__:65 | Management link open: 1
2020-06-28 16:36:31,157 INFO pid=19209 tid=MainThread file=mgmt_operation.py:__init__:65 | 'Failed opening message receiver' ('/data/src/vendor/azure-uamqp-c/src/amqp_management.c':'amqp_management_open_async':981)
2020-06-28 16:36:31,958 INFO pid=19209 tid=MainThread file=error.py:_handle_exception:233 | u'eventhub.pysdk-3ff58abd' has an exception (AMQPConnectionError('Unable to open management session. Please confirm URI namespace exists.',)). Retrying...
2020-06-28 16:36:31,958 DEBUG pid=19209 tid=MainThread file=client.py:close:295 | Closing non-CBS session.
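The 'Failure: getaddrinfo failure 66.' line above means the Splunk server could not resolve the Event Hub namespace's hostname, so the TLS/AMQP layers never get a socket to open. A minimal sketch to check name resolution from the Splunk host (the namespace FQDN below is a placeholder, not from this thread):

```python
import socket

def resolve(host):
    """Return the sorted list of IP addresses host resolves to, or None on failure."""
    try:
        infos = socket.getaddrinfo(host, 5671, proto=socket.IPPROTO_TCP)
        return sorted({info[4][0] for info in infos})
    except socket.gaierror:
        return None

# Placeholder namespace; substitute the FQDN from your input's connection string.
# print(resolve("mynamespace.servicebus.windows.net"))
```

If this returns None on the Splunk server for the failing namespace but a list of addresses for the working one, the problem is DNS, not the add-on.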

 


staten
Observer

Is the internal firewall configured to allow TCP ports 5671 and 5672 from the Splunk server out to the event hub (<hubname>.servicebus.windows.net)?

Ref: https://community.splunk.com/t5/All-Apps-and-Add-ons/Microsoft-Azure-Add-on-Error-setting-pulling-Ev...
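A quick way to verify that firewall path from the Splunk server itself (sketch; the namespace hostname is a placeholder):

```python
import socket

def can_connect(host, port, timeout=5):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder namespace; Event Hubs uses 5671 for AMQP over TLS, 5672 for plain AMQP.
# for port in (5671, 5672):
#     print(port, can_connect("mynamespace.servicebus.windows.net", port))
```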


rayar
Path Finder

Thanks.

Our access to the Internet is via a proxy (our ports are 8080 and 443).

Is there any other option?


staten
Observer

Do you have the proxy configured in the Microsoft Azure Add-on for Splunk? Look at the Configuration -> Proxy tab in the add-on web UI.

If not, then configure the add-on to use the Proxy.

If the Proxy is already configured in the add-on, then contact the folks who manage your organization's proxy and explain to them that you're looking to support additional outbound ports.
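When AMQP traffic has to go through an HTTP proxy, it is tunneled with an HTTP CONNECT request, so the proxy must permit CONNECT to port 5671. A hedged sketch for checking that (proxy address and namespace are placeholders; this only inspects the CONNECT status line, it does not speak AMQP):

```python
import socket

def build_connect_request(host, port):
    """Build the HTTP CONNECT request used to tunnel TCP through a proxy."""
    return ("CONNECT {0}:{1} HTTP/1.1\r\n"
            "Host: {0}:{1}\r\n\r\n").format(host, port).encode("ascii")

def proxy_allows_tunnel(proxy_host, proxy_port, dest_host, dest_port, timeout=5):
    """Return True if the proxy answers the CONNECT with a 2xx status line."""
    try:
        with socket.create_connection((proxy_host, proxy_port), timeout=timeout) as s:
            s.sendall(build_connect_request(dest_host, dest_port))
            status_line = s.recv(4096).split(b"\r\n", 1)[0]
            # e.g. b"HTTP/1.1 200 Connection established" -> fields[1] is "200"
            fields = status_line.split()
            return len(fields) >= 2 and fields[1].startswith(b"2")
    except OSError:
        return False

# Placeholders: your proxy and namespace.
# print(proxy_allows_tunnel("proxy.example.com", 8080,
#                           "mynamespace.servicebus.windows.net", 5671))
```

If this returns False while a tunnel to port 443 succeeds, the proxy team would need to allow CONNECT to 5671/5672 as staten suggests.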


rayar
Path Finder

We are using a Private Endpoint, so a proxy is not required.

Anyway, I have another Event Hub in the same range working properly with Splunk.

Actually, the Event Hub I am trying to connect to is a copy of the working Event Hub.

Any idea what might be the issue?
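Since the failing hub is a copy of a working one and the log shows a getaddrinfo failure, a useful first step is to extract the namespace FQDN each input actually points at and confirm the Splunk server resolves the new one (with a Private Endpoint it must resolve to the private IP). A sketch using the standard Azure Event Hub connection-string format (all values below are placeholders):

```python
import socket

def namespace_host(connection_string):
    """Extract the namespace FQDN from an Event Hub connection string.

    Standard format:
    Endpoint=sb://<ns>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...;EntityPath=<hub>
    """
    parts = dict(p.split("=", 1) for p in connection_string.strip().split(";") if p)
    endpoint = parts["Endpoint"]  # e.g. sb://ns.servicebus.windows.net/
    return endpoint.split("://", 1)[1].rstrip("/")

# Placeholder values for illustration:
cs = ("Endpoint=sb://copied-ns.servicebus.windows.net/;"
      "SharedAccessKeyName=RootManageSharedAccessKey;"
      "SharedAccessKey=abc123=;EntityPath=insights-logs")
print(namespace_host(cs))  # copied-ns.servicebus.windows.net

# Then compare resolution of the working vs. copied namespace from the Splunk host:
# socket.getaddrinfo(namespace_host(cs), 5671)
```

If the copied namespace does not resolve (or resolves to a public IP instead of the Private Endpoint's private IP), the Private DNS zone link for the new namespace is the likely gap.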
