Security

AI integration with Splunk - How can I encapsulate the token value (example: '$SourceCode_Tok$')?

Marino25
Observer

Hello,

I am working on a project to integrate Splunk with an LLM. I have created an app that searches for vulnerabilities within source code and reports back the findings, as well as the modified source code with the findings corrected.

Now, here's my issue: it works for JavaScript, Java, PHP, and many others, but the moment I upload Python or Simple XML, the system cannot process it because it thinks the uploaded code is part of the actual dashboard code.

My question is: how can I encapsulate the token value (for example, '$SourceCode_Tok$') so that the system interprets it correctly as external code to be analyzed?

Thank you all!
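
One possible workaround, offered only as a sketch and not something confirmed in this thread: make the payload opaque to Splunk's $...$ token substitution by base64-encoding the source text before it is ever stored in the token, and decoding it again in whatever backend script receives the value. The helper names wrap_for_token and unwrap_token_value below are illustrative, not part of any Splunk API.

# Hedged sketch: base64-wrap source code so Splunk token substitution
# never sees raw '$...$' sequences. Helper names are illustrative.
import base64

def wrap_for_token(source_code):
    """Encode arbitrary source text into an opaque ASCII string that is
    safe to pass around as a dashboard token value."""
    return base64.b64encode(source_code.encode("utf-8")).decode("ascii")

def unwrap_token_value(token_value):
    """Decode the value received by the backend (for example, a custom
    search command) back into the original source text."""
    return base64.b64decode(token_value.encode("ascii")).decode("utf-8")

if __name__ == "__main__":
    snippet = 'value = f"{a}$SourceCode_Tok${b}"  # looks like a Splunk token'
    opaque = wrap_for_token(snippet)
    assert unwrap_token_value(opaque) == snippet
    print(opaque)

The round trip above is the whole idea: the dashboard only ever carries the encoded string, and the analysis step decodes it before handing the code to the LLM.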


syaganti
Loves-to-Learn Everything

I'm hoping you've found a solution. I'm working on a similar project where I created an app in splunk/etc/apps/my-app with a .py file in the bin folder and a .conf file in the default folder. Initially, when I ran the command <| mycommand "hello"> in Splunk, it output a response that I had hardcoded in my .py file. However, after updating the script to generate responses via a large language model, I started encountering the following error.

Error in 'mycommand' command: External search command exited unexpectedly with non-zero error code 1.
The search job has failed due to an error. You may be able to view the job in the Job Inspector.

Please help me with this. 
Thanks in advance

#splunk #LLM
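
For what it's worth, here is a minimal, hedged sketch of how a custom search command like this is usually structured with the splunklib Python SDK, with the model call stubbed out and exceptions caught so failures are logged rather than killing the process. A common cause of the generic exit-code-1 error is an unhandled exception, or an LLM client package that is not importable by the Python interpreter Splunk launches. The command name, the 'prompt' option, and fake_llm_response are placeholders, not anything from the original post.

#!/usr/bin/env python
# Minimal sketch of a generating custom search command built on the
# splunklib Python SDK (the SDK package must be bundled with the app).
import sys
import time

from splunklib.searchcommands import dispatch, GeneratingCommand, Configuration, Option


def fake_llm_response(prompt):
    # Placeholder for the real model call. If the real client library is
    # not importable by the Python that Splunk invokes, the script dies at
    # import time, which appears in the UI as "exited unexpectedly with
    # non-zero error code 1".
    return "echo: " + prompt


@Configuration()
class MyCommand(GeneratingCommand):
    prompt = Option(require=True)

    def generate(self):
        try:
            answer = fake_llm_response(self.prompt)
            yield {"_time": time.time(), "prompt": self.prompt, "response": answer}
        except Exception as err:
            # Log the failure to the job's search.log and return it as an
            # event instead of letting the process crash.
            self.logger.error("LLM call failed: %s", err)
            yield {"_time": time.time(), "prompt": self.prompt, "response": "ERROR: %s" % err}


if __name__ == "__main__":
    dispatch(MyCommand, sys.argv, sys.stdin, sys.stdout, __name__)

With this style the invocation uses a named option, e.g. | mycommand prompt="hello", and the commands.conf stanza needs chunked = true so Splunk speaks the SDK's search command protocol; a mismatch there is another frequent source of the same error.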