Getting Data In

Splunk Add-on for Microsoft Cloud Services

Knightrider1234
Engager

Guys, could you please shed some light here?

I have configured Azure API Management to stream events to an Event Hub and the Splunk add-on to connect to the Event Hub. I am receiving the error below.

I have given the Azure Event Hubs Data Receiver role to the IAM account used in the integration.

2021-05-31 14:58:33,763 level=ERROR pid=9469 tid=MainThread logger=__main__ pos=utils.py:wrapper:72 | datainput="myapim" start_time=1622473113 | message="Data input was interrupted by an unhandled exception."
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/Splunk_TA_microsoft-cloudservices/lib/splunksdc/utils.py", line 70, in wrapper
    return func(*args, **kwargs)
  File "/opt/splunk/etc/apps/Splunk_TA_microsoft-cloudservices/bin/mscs_azure_event_hub.py", line 636, in run
    consumer = self._create_event_hub_consumer(workspace, credential, proxy)
  File "/opt/splunk/etc/apps/Splunk_TA_microsoft-cloudservices/bin/mscs_azure_event_hub.py", line 592, in _create_event_hub_consumer
    args.consumer_group,
  File "/opt/splunk/etc/apps/Splunk_TA_microsoft-cloudservices/bin/mscs_azure_event_hub.py", line 215, in open
    checkpoint = SharedLocalCheckpoint(fullname)
  File "/opt/splunk/etc/apps/Splunk_TA_microsoft-cloudservices/bin/mscs_azure_event_hub.py", line 87, in __init__
    self._fd = os.open(fullname, os.O_RDWR | os.O_CREAT)
FileNotFoundError: [Errno 2] No such file or directory: '/opt/splunk/var/lib/splunk/modinputs/mscs_azure_event_hub/Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=xxxxxxxxx-myeventhub-myconsumergroup.v1.ckpt'
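For anyone digging into the error itself: the path in that FileNotFoundError suggests the add-on builds its checkpoint filename from whatever value is configured as the event hub namespace. If that value is a full connection string, the embedded "sb://" slashes get treated as directory separators, so os.open cannot create the checkpoint file. A minimal sketch of that behaviour in plain Python (not the add-on's own code; the directory and filenames here are made up):

import os
import tempfile

checkpoint_dir = tempfile.mkdtemp()

# FQDN-only namespace value: no path separators, so the checkpoint file is created.
good = "mynamespace.servicebus.windows.net-myeventhub-myconsumergroup.v1.ckpt"
fd = os.open(os.path.join(checkpoint_dir, good), os.O_RDWR | os.O_CREAT)
os.close(fd)

# Connection-string value: "sb://" introduces "/" characters, so os.open looks for
# subdirectories that do not exist and raises the same FileNotFoundError as above.
bad = "Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKey=xxx-myeventhub-myconsumergroup.v1.ckpt"
try:
    os.open(os.path.join(checkpoint_dir, bad), os.O_RDWR | os.O_CREAT)
except FileNotFoundError as err:
    print(err)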

maplebuddy
Explorer

@Knightrider1234 I found a solution. I searched and couldn't find an answer, so I will post this here for anyone else who is experiencing the issue above.
I initially started with the Microsoft Azure Add-on for Splunk. I found "The Event Hub input has been deprecated in this add-on. Please use the Splunk supported Splunk Add-on for Microsoft Cloud Services to ingest Event Hub data" on the inputs page of the app.

I then figured out the difference:

Microsoft Azure Add-on for Splunk (Event Hub input now deprecated)
-> ingests Event Hubs through the old ClientSecret string

Splunk Add-on for Microsoft Cloud Services
-> ingests Event Hubs through a modern Azure AD app with reader rights on the Event Hub

You must navigate to Subscriptions -> your subscription -> Access Control (IAM) -> select (+ Add) and give the Splunk app the Azure Event Hubs Data Receiver role. In the Event Hub setup of the Splunk Add-on for Microsoft Cloud Services, give the FQDN only (e.g. lab-eventhub.servicebus.windows.net) and provide the event hub name in the following field. This worked for me and I immediately started getting logs in.
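If it helps to verify the Azure AD app and the FQDN-only namespace outside of Splunk, here is a minimal sketch using the azure-identity and azure-eventhub Python packages (not the add-on itself; the tenant ID, client ID, secret, hub name, and consumer group below are placeholders):

from azure.identity import ClientSecretCredential
from azure.eventhub import EventHubConsumerClient

# Azure AD app registration that holds the Azure Event Hubs Data Receiver role.
credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<application-client-id>",
    client_secret="<client-secret>",
)

# Note: the namespace is the bare FQDN, not an "Endpoint=sb://..." connection string.
client = EventHubConsumerClient(
    fully_qualified_namespace="lab-eventhub.servicebus.windows.net",
    eventhub_name="myeventhub",
    consumer_group="$Default",
    credential=credential,
)

def on_event(partition_context, event):
    # Print each received event; Ctrl-C to stop.
    print(partition_context.partition_id, event.body_as_str())

with client:
    client.receive(on_event=on_event, starting_position="-1")

If this sketch receives events but the add-on still errors out, the problem is most likely in how the namespace or event hub name is entered in the input configuration rather than in the role assignment.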

Hope this helps!


cdahal
Explorer

Wondering if someone has resolved this issue already, as I am having the same issue.


n0psl1de
Explorer

I am having the same issue. 


gazoscreek
Path Finder

Hello ... Did you ever get this resolved? I'm running into the same issue. It seems to have something to do with the event_hub_namespace parameter in the config file, but I've not been successful at figuring out what the problem is.

 

Thank you.


KnightRider
Engager

Hi gazoscreek,

Sorry for the late response. I am still having the same issue.  

I am waiting for someone from this community to shed some light.


maplebuddy
Explorer

Hey all, I am experiencing the same issue. @KnightRider @gazoscreek any update or working use case?
