Security

AI integration with Splunk - How can I encapsulate the token value (example: '$SourceCode_Tok$')?

Marino25
Observer

Hello,

I am working on a project to integrate Splunk with an LLM. I have created an app that searches for vulnerabilities within source code and reports back the findings, as well as the modified source code with the findings corrected.

Now, here's my issue: it works for JavaScript, Java, PHP, and many other languages, but the moment I upload Python or Simple XML, the system cannot process it because it treats the submitted code as part of the actual dashboard code.

My question is: how can I encapsulate the token value (for example, '$SourceCode_Tok$') so that the system interprets it correctly as external code to be analyzed?
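
For context, here is a simplified sketch of how the token is wired up in the dashboard (the input and search below are illustrative placeholders, not my exact app):

<form>
  <fieldset>
    <!-- text input that captures the pasted source code into the token -->
    <input type="text" token="SourceCode_Tok">
      <label>Source code to analyze</label>
    </input>
  </fieldset>
  <row>
    <panel>
      <search>
        <!-- the raw token value is dropped straight into the SPL, so quotes,
             newlines, and $...$ sequences inside the pasted code get treated
             as dashboard/search syntax instead of plain text -->
        <query>| makeresults | eval source_code=$SourceCode_Tok$</query>
      </search>
    </panel>
  </row>
</form>

I am guessing I need some form of quoting or escaping at that substitution point, but I have not found one that works for Python and Simple XML payloads.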

Thank you all!
