Hi,
I'm using version 4.1.5 of the Splunk Add-on for Microsoft Cloud Services on my heavy forwarder running Splunk 8.0.9.
I've configured an Azure App Account in the add-on and an input for collecting Azure DevOps audit data, but no logs are arriving in Splunk. I'm getting the warning below in "splunk_ta_microsoft_cloudservices_mscs_azure_event_hub_AzureDevopsAudit.log":
2021-09-09 08:22:45,926 level=WARNING pid=84608 tid=Thread-2 logger=uamqp.authentication.cbs_auth pos=cbs_auth.py:handle_token:122 | Authentication Put-Token failed. Retries exhausted.
CPU usage also rises to 90% when the input is enabled.
Any ideas?
Regards, Martin
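One hedged way to narrow down a "Put-Token failed" warning like the one above is to check, outside the add-on, whether the same Azure App registration can obtain an Event Hubs token at all. The sketch below assumes nothing about the add-on itself; it just performs a standard OAuth2 client-credentials request against Azure AD with placeholder tenant/client values you would have to fill in yourself.

```python
# Minimal sketch (assumptions: TENANT_ID / CLIENT_ID / CLIENT_SECRET are
# placeholders for your own Azure App registration). Requests an Azure AD
# token for the Event Hubs data plane via the client-credentials flow, to
# check whether the add-on's credentials are valid at all.
import json
import urllib.parse
import urllib.request

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"

# Event Hubs data-plane tokens are requested with this scope.
EVENTHUBS_SCOPE = "https://eventhubs.azure.net/.default"

def build_token_request(tenant_id, client_id, client_secret, scope):
    """Build the OAuth2 client-credentials token request for Azure AD."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    }).encode()
    return url, body

# Only attempt the live request once the placeholders are replaced.
if __name__ == "__main__" and "<" not in CLIENT_SECRET:
    url, body = build_token_request(TENANT_ID, CLIENT_ID, CLIENT_SECRET,
                                    EVENTHUBS_SCOPE)
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        token = json.load(resp)
    print("got token, expires in", token.get("expires_in"), "seconds")
```

If this request fails, the problem is the credentials (expired secret, wrong tenant); if it succeeds, the failure is more likely on the Event Hubs side (networking, permissions, or entity name).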
Hi!
Not sure if this helps you, but I found a solution that works for me. We ended up sending our Azure DevOps audit logs to a Log Analytics workspace and exporting the data from there to the event hub. The Microsoft Add-on for Cloud Services was then able to fetch the data from the event hub.
I have checked the namespace on mine but I'm seeing the same error message. This was working at one time and then it stopped.
I've heard of this before, and it was an issue with the "Firewalls and virtual networks" settings in the Networking section on the event hub namespace. The settings were blocking the incoming connection from the Splunk add-on. After allowing the IP address (or CIDR) of the Splunk forwarder, data started coming in.
Reference => https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-ip-filtering
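A quick way to test the firewall theory from the forwarder itself is a plain TCP reachability check against the namespace. This is only a sketch (the namespace name is a placeholder); it checks the two ports the add-on can use depending on the transport setting: 5671 for plain AMQP and 443 for AMQP over WebSocket.

```python
# Sketch of a reachability check from the Splunk forwarder to the event hub
# namespace, to see whether "Firewalls and virtual networks" rules are
# blocking the connection. NAMESPACE is a placeholder for your own.
import socket

NAMESPACE = "<your-namespace>.servicebus.windows.net"

# Ports used depending on the add-on's transport setting.
TRANSPORT_PORTS = {"amqp": 5671, "websocket": 443}

def port_for(transport):
    """Return the TCP port the given transport connects on."""
    return TRANSPORT_PORTS[transport]

def can_reach(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for transport in ("amqp", "websocket"):
        port = port_for(transport)
        state = "reachable" if can_reach(NAMESPACE, port) else "blocked"
        print(f"{transport} (port {port}): {state}")
```

If both ports show as blocked from the forwarder but the namespace works from elsewhere, the firewall/virtual-network rules are the likely culprit.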
Hi,
we are doing a PoC to bring data from Event Hubs into Splunk, and we are having the exact same issue on our side. We have whitelisted the Splunk heavy forwarders on the Azure side, but we are using private endpoints instead of public ones; since all our event hubs, namespaces, and data (non-prod and prod) are in the prod zone, we had to use this.
Any ideas what other permissions we might have to add to our application account? Note that we tried both enabling and disabling AMQP over WebSocket, and we get a different error message each way:
Error when AMQP over WebSocket is enabled:
2022-01-06 15:17:25,838 level=WARNING pid=14122 tid=Thread-2 logger=azure.eventhub._eventprocessor.event_processor pos=event_processor.py:_load_balancing:286 | EventProcessor instance 'xxxxxxxxxxxxxxxxxxxxxx' of eventhub 'xxxxxxxxxxxx-NON-PROD' consumer group 'xxxxxxxxxxxxxxxxx_group'. An error occurred while load-balancing and claiming ownership. The exception is AuthenticationError("The messaging entity 'xxxxxxxxxxxxxxxxx.windows.net/xxxxxxxxxxxxxxxxxxxxNON-PROD' could not be found. To know more visit https://aka.ms/sbResourceMgrExceptions. TrackingId:xxxxxxxxxxxxxxxxx-non-prod.servicebus.windows.net:xxxxxxxxx-NON-PROD, Timestamp:2022-01-06T20:17:30\nCBS Token authentication failed.\nStatus code: 404\nDescription: The messaging entity 'xxxxxxx.servicebus.windows.net/xxxxxxxxxxx-NON-PROD' could not be found. To know more visit https://aka.ms/sbResourceMgrExceptions. TrackingId:xxxxxxxxxxx, SystemTracker:exxxxxxx-non-prod.servicebus.windows.net:xxxx-NON-PROD, Timestamp:2022-01-06T20:17:30"). Retrying after 11.688764392967611 seconds
2022-01-06 15:17:19,420 level=WARNING pid=14122 tid=Thread-2 logger=uamqp.authentication.cbs_auth pos=cbs_auth.py:handle_token:119 | Authentication Put-Token failed. Retries exhausted.
Error when AMQP over WebSocket is disabled:
2022-01-06 15:16:31,386 level=WARNING pid=3860 tid=Thread-2 logger=azure.eventhub._eventprocessor.event_processor pos=event_processor.py:_load_balancing:286 | EventProcessor instance 'xxxxxxx' of eventhub 'xxxxxxxxxxxx01-NON-PROD' consumer group 'preview_data_consumer_group'. An error occurred while load-balancing and claiming ownership. The exception is ConnectError('Failed to open mgmt link: MgmtOpenStatus.Error\nFailed to open mgmt link: MgmtOpenStatus.Error'). Retrying after 10.389673444151345 seconds
2022-01-06 15:16:06,834 level=WARNING pid=3860 tid=Thread-2 logger=azure.eventhub._eventprocessor.event_processor pos=event_processor.py:_load_balancing:286 | EventProcessor instance 'xxxxxxxxxxxxxxx' of eventhub 'xxxxxxxxxx01-NON-PROD' consumer group 'preview_data_consumer_group'. An error occurred while load-balancing and claiming ownership. The exception is ConnectError('Failed to open mgmt link: MgmtOpenStatus.Error\nFailed to open mgmt link: MgmtOpenStatus.Error'). Retrying after 11.352936212034383 seconds
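With private endpoints, one thing worth ruling out (this is a hedged suggestion, not a confirmed diagnosis of the errors above) is DNS: the namespace's FQDN must resolve, from the forwarder, to the private endpoint's private IP. If it still resolves to a public IP, traffic bypasses the private endpoint and can be rejected, surfacing as authentication or management-link failures. A stdlib-only sketch, with the namespace name as a placeholder:

```python
# Hedged sketch: check what the event hub namespace FQDN resolves to from
# this host. With a private endpoint, it should resolve to a private IP;
# a public IP here suggests a private DNS zone / resolver misconfiguration.
import ipaddress
import socket

NAMESPACE = "<your-namespace>.servicebus.windows.net"

def is_private_ip(ip):
    """True if the address falls in a private (RFC 1918 etc.) range."""
    return ipaddress.ip_address(ip).is_private

def resolved_ips(host):
    """Resolve host to its set of IP addresses."""
    return {info[4][0] for info in socket.getaddrinfo(host, None)}

if __name__ == "__main__" and "<" not in NAMESPACE:
    for ip in resolved_ips(NAMESPACE):
        kind = "private" if is_private_ip(ip) else "PUBLIC"
        print(f"{NAMESPACE} -> {ip} ({kind})")
```

Separately, the 404 "messaging entity could not be found" in the WebSocket case is worth double-checking against the exact namespace FQDN and event hub name configured in the add-on, since that status usually points at the entity path rather than the network.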
Thanks!
The firewall/network settings look fine. However, I'm seeing a lot of the error below:
Any suggestions here? We are also getting the same error.
Thanks! That seems to be our problem as well.
We fixed it and now the data is flowing.
