I am trying to use the ai command in Splunk Machine Learning Toolkit 5.6.0 with the Llama Guard 4 model; note that it does not require an access token. I am testing the following prompt and keep getting the error below. Please assist with the correct format for using Llama Guard 4.
Test Prompt:
index=_internal log_level=error | table _time _raw | ai prompt="Please summarise these error messages and describe why I might receive them: {_raw}"
Error Message:
SearchMessage orig_component=SearchOrchestrator sid=[sid] message_key= message=Error in 'ai' command: No default model was found.
Hi,
You must provide a `model` parameter in your `| ai` command.
https://docs.splunk.com/Documentation/MLApp/5.6.0/User/Aboutaicommand#Parameters_for_the_ai_command
Also, I assume that you have configured the required details on the Connection Management page: https://docs.splunk.com/Documentation/MLApp/5.6.0/User/Aboutaicommand#Connection_Management_page
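For example, a minimal search that names the model explicitly might look like the following. The provider and model values here are placeholders, not known-good values; substitute whatever you actually configured on the Connection Management page:

index=_internal log_level=error
| table _time _raw
| ai prompt="Please summarise these error messages and describe why I might receive them: {_raw}" provider=&lt;your_provider&gt; model=&lt;your_model&gt;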
hi @Nadeen_98
Can you confirm that in the Connection Management settings you have "Set as Default" checked for one of the models?
If so, please can you check the logs for any errors:
index="_internal" (source="*mlspl.log" OR sourcetype="mlspl" OR source="*python.log*")
Thanks for your help @livehybrid
I've added Groq as the default provider, but now a different error occurs.
Revised Prompt:
index=_internal log_level IN ("ERROR","WARNING", "WARN")
| table _time _raw
| ai prompt="Please explain these error messages to me: {_raw}" provider="Groq" model="llama3-70b-8192"
Error Message:
RunDispatch has failed: sid=[sid], exit=-1, error=Error in 'ai' command: The provider: '"Groq"' is invalid. Please check the configuration
The <SPLUNK_HOME>/var/log/splunk/mlspl.log may give you more details about the error.
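One thing worth double-checking (this is an assumption on my part, not something confirmed by the docs quoted above): the provider value passed to the ai command may need to match the connection name exactly as it appears on the Connection Management page, including case. As a sketch, if the connection were saved under a lowercase name such as groq, the search would be:

index=_internal log_level IN ("ERROR","WARNING","WARN")
| table _time _raw
| ai prompt="Please explain these error messages to me: {_raw}" provider=groq model="llama3-70b-8192"

If the names already match, the mlspl.log search shown earlier in the thread should surface the underlying configuration error.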