Security

AI integration with Splunk- How can I encapsulate the token value example:  '$SourceCode_Tok$' ?

Marino25
Observer

Hello,

I am working on a project to integrate Splunk with an LLM. I have created an app that searches for vulnerabilities within source code and reports back the findings, as well as the modified source code with the findings corrected.

Now, here's my issue: it works for JavaScript, Java, PHP, and many others, but the moment I upload Python or Simple XML, the system cannot process it because it treats the content as part of the actual dashboard code.

My question is: how can I encapsulate the token value (for example, '$SourceCode_Tok$') so that the system interprets it correctly as external code to be analyzed?
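In case it helps anyone with the same problem: Simple XML treats anything of the form $name$ as a token reference, and renders a doubled dollar sign ($$) as a literal $. So one possible workaround (a sketch under that assumption, not a confirmed fix for this app) is to escape every dollar sign in the uploaded source before it is placed into the dashboard token, and unescape it before sending the code to the LLM:

```python
def escape_for_simple_xml(source_code: str) -> str:
    """Double each '$' so Simple XML renders it literally
    instead of parsing $name$ as a dashboard token."""
    return source_code.replace("$", "$$")

def unescape_from_simple_xml(escaped: str) -> str:
    """Reverse the escaping before handing the code to the analyzer."""
    return escaped.replace("$$", "$")

# Hypothetical snippet that would otherwise collide with token syntax:
snippet = "value = '$SourceCode_Tok$'"
escaped = escape_for_simple_xml(snippet)
print(escaped)                                  # value = '$$SourceCode_Tok$$'
print(unescape_from_simple_xml(escaped) == snippet)
```

The round trip keeps the original code intact for analysis while keeping the dashboard from interpreting it.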

Thank you all!
