Security

AI integration with Splunk - How can I encapsulate the token value (example: '$SourceCode_Tok$')?

Marino25
Observer

Hello,

I am working on a project to integrate Splunk with an LLM. I have created an app that searches for vulnerabilities within source code and reports back the findings, as well as the modified source code with the corrections applied.

Now, here's my issue: it works for JavaScript, Java, PHP, and many other languages, but the moment I upload Python or Simple XML, the system cannot process it because it treats the uploaded code as part of the actual dashboard code.

My question is: how can I encapsulate the token value (for example, '$SourceCode_Tok$') so the system interprets it correctly as external code to be analyzed?
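
To illustrate what I mean, here is a minimal sketch of one idea I am considering. The helper names are hypothetical, and it assumes the uploaded code ends up inside a Simple XML dashboard token (where a doubled '$$' is rendered as a literal '$'), which may or may not be the right way to handle this:

# Minimal sketch, assuming the uploaded source code travels as a plain
# string before being placed into a Simple XML token such as $SourceCode_Tok$.

def escape_for_simple_xml_token(source_code: str) -> str:
    """Double every '$' so Simple XML shows it as a literal character
    instead of treating '$...$' pairs as token references."""
    return source_code.replace("$", "$$")


def unescape_from_simple_xml_token(escaped: str) -> str:
    """Reverse the escaping before handing the code to the LLM for analysis.
    Assumes the original code did not already contain '$$'."""
    return escaped.replace("$$", "$")


if __name__ == "__main__":
    uploaded = 'query = "index=main $user$"  # Python line that looks like it holds a token'
    escaped = escape_for_simple_xml_token(uploaded)
    print(escaped)  # query = "index=main $$user$$"  ...
    print(unescape_from_simple_xml_token(escaped) == uploaded)  # True

The idea is to escape on the way into the dashboard and unescape just before the code is sent for analysis, but I am not sure this is the intended way to keep Splunk from resolving those substrings as its own tokens.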

Thank you all!
