All Apps and Add-ons

Failed to load algorithm mltkc.MLTKContainer.

williame
Observer

I am trying to set up the Deep Learning Toolkit on Splunk Cloud using Azure Kubernetes Service. I am able to connect to the containers and launch Jupyter Notebook; however, when I try to execute an example model, the following error message is received:

Error in 'fit' command: Error while initializing algorithm "MLTKContainer": Failed to load algorithm "mltkc.MLTKContainer".

The algorithm used in the example is present in the app/models folder in Jupyter.

Any thoughts on what might be wrong here?


williame
Observer

UPDATE: I checked the search log and it looks like there is a bug:

11-30-2021 09:50:56.490 ERROR ChunkedExternProcessor [8184 ChunkedExternProcessorStderrLogger] - stderr: ImportError: cannot import name 'splitattr' from 'urllib.request' (/opt/splunk/etc/apps/Splunk_SA_Scientific_Python_linux_x86_64/bin/linux_x86_64/lib/python3.8/urllib/request.py)

This is the search.log file from the error:

https://pastebin.com/irFnXiBr
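
For reference, the failing import can be checked directly against the Python 3.8 interpreter bundled with the Scientific Python add-on (the path comes from the error above). This is only a diagnostic sketch:

# Run with the Python 3.8 shipped in Splunk_SA_Scientific_Python_linux_x86_64
# (the interpreter referenced in the ImportError above).
import urllib.request
import urllib.parse

# On Python 3.8, urllib.request no longer re-exports splitattr,
# which is exactly what the ImportError in search.log complains about:
print(hasattr(urllib.request, "splitattr"))  # False on Python 3.8

# urllib.parse still provides it on 3.8, although it is deprecated there:
print(hasattr(urllib.parse, "splitattr"))    # True on Python 3.8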


williame
Observer

UPDATE: I tried running DLTK version 3.7.0 on a Splunk Enterprise instance, since only 3.6.0 is available for Splunk Cloud. This fixed the issue.

 

So we have managed to run DLTK successfully on Splunk Enterprise, but we are still looking for a solution for Splunk Cloud, since 3.7.0 is not available there yet. The problem seems to be a deprecated import in mltkc/MLTKContainer.py: splitattr is imported from urllib.request, which worked on earlier Python 3 versions but fails on the Python 3.8 bundled with the Scientific Python add-on, where the function only remains available (deprecated) in urllib.parse.
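
If the failing line really is a plain "from urllib.request import splitattr" somewhere inside mltkc (an assumption based on the traceback, I have not confirmed the exact file and line), a minimal compatibility shim would look something like this:

# Sketch of a fallback import; the original import location is an assumption.
try:
    from urllib.request import splitattr  # worked on Python 3.7 and earlier
except ImportError:
    from urllib.parse import splitattr    # still present (deprecated) on Python 3.8

# splitattr splits a URL path from its ;key=value attributes:
url, attrs = splitattr("/path/to/file;type=a;rev=2")
print(url)    # /path/to/file
print(attrs)  # ['type=a', 'rev=2']

That said, on Splunk Cloud we cannot patch installed app code ourselves, so for now waiting for 3.7.0 to become available there looks like the practical fix.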
