Splunk Tech Talks
Deep-dives for technical practitioners.

Splunk MCP & Agentic AI: Machine Data Without Limits

DayaSCanales
Splunk Employee


Discover how the Splunk Model Context Protocol (MCP) Server can revolutionize the way your organization uses machine data.

In this Tech Talk, Splunk experts reveal how MCP empowers Agentic AI to easily tap into live data, eliminating complexity and opening up new opportunities for insight and automation. Whether you’re new to AI or looking to maximize your data’s value, this session is designed to inspire and inform.

Ready to see your data in action? Watch the replay!

In this session, you’ll learn:

  • What Agentic AI is and how it works with Splunk MCP
  • Essentials for installing and configuring Splunk’s MCP Server
  • Use cases demonstrating MCP in action with autonomous AI systems
DayaSCanales
Splunk Employee

Here are a few top-of-mind questions from the live Tech Talk:

 

Q. What is the chargeback model for using the Splunk MCP server?

A. MCP is a free app, and any usage is directly attributed via the token tied to the user.


Q. From a security perspective, how do we control which LLMs are allowed to integrate and connect?

A. Ultimately, the MCP server is like an API gateway. The clients that connect to Splunk MCP are entirely owned by customers, so you would need to do your own security review to make sure you trust the LLM that connects to the MCP server.


Q. How does Splunk ensure safety of Splunk interactions with MCP servers?

A. Splunk enforces guardrails on tools. Specifically, the Run Splunk Query tool only performs read operations, and queries will not run longer than one minute in case the LLM writes a bad query.
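The kind of guardrail described above can be sketched as a simple pre-flight check on the SPL string before dispatch. This is a hypothetical illustration, not Splunk's actual implementation; the blocked-command list and the 60-second cap are assumptions for the sketch:

```python
# Hypothetical sketch of read-only + timeout guardrails for LLM-generated SPL.
# The command blocklist and 60-second cap are assumptions, not Splunk's code.

WRITE_COMMANDS = {"delete", "collect", "outputlookup", "sendemail"}
MAX_RUNTIME_SECONDS = 60

def vet_query(spl: str) -> dict:
    """Reject SPL containing write commands; attach the allowed runtime."""
    # SPL chains commands with "|"; check the first word of each segment.
    for segment in spl.lower().split("|"):
        command = segment.strip().split(" ")[0] if segment.strip() else ""
        if command in WRITE_COMMANDS:
            raise ValueError(f"write command not allowed: {command}")
    # A dispatcher would pass this as the search's execution time limit.
    return {"query": spl, "timeout": MAX_RUNTIME_SECONDS}
```

With this guard, `vet_query("search index=_internal | stats count")` passes through with the one-minute cap attached, while a query containing `| delete` is refused before it ever reaches the search head.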


Q. For Splunk Cloud customers, does using the MCP server or the Agentic AI have an additional cost? More ingestion being done? More storage being needed?

A. There is no direct need for extra storage, since the MCP server leverages the pre-existing data on the Splunk instance. The only direct correlation is an increase in SVC usage or Splunk searches, which is billed according to your Splunk license.


Q. What do you think are the maturity prerequisites for using agentic approaches?

A. Several factors determine whether agents make sense, starting with your technical know-how, the AI systems available to you, and a clearly defined objective. Please keep in mind that the more agency an agent has, the more guardrails or restrictions should be in place.


Q. Currently manage Splunk RBAC using LDAP. Because MCP doesn’t appear to have built-in RBAC, does that imply that index-level access when using MCP needs to be enforced by the MCP client on a per-user or per-team basis? If that’s the case, what’s the best way to scale this across dozens of teams?

A. MCP uses existing Splunk tokens tied to each user's access and capabilities, so your existing RBAC setup should work for your users.
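Because the token carries the user's identity, any search issued with it inherits that user's roles and index permissions, so enforcement stays on the Splunk side rather than in the MCP client. A minimal sketch of how a client might assemble a request against Splunk's standard REST search endpoint (`/services/search/jobs`); the host, token placeholder, and helper function are illustrative:

```python
# Illustrative sketch: build a Splunk REST search request authenticated with a
# per-user token, so index-level RBAC is enforced by Splunk itself.

def build_search_request(base_url: str, user_token: str, spl: str) -> dict:
    """Return the pieces of an HTTP POST to Splunk's search jobs endpoint."""
    return {
        "url": f"{base_url}/services/search/jobs",
        # The bearer token identifies the user; Splunk applies that user's
        # roles and index permissions when running the search.
        "headers": {"Authorization": f"Bearer {user_token}"},
        "data": {"search": spl, "output_mode": "json"},
    }

req = build_search_request(
    "https://splunk.example.com:8089",  # hypothetical management port URL
    "<user-token>",                     # per-user token, not a shared one
    "search index=web_logs | head 10",
)
```

Scaling across dozens of teams then becomes a token-distribution problem rather than a client-side ACL problem: each team's users present their own tokens, and Splunk's existing role definitions do the rest.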


Q. Are MCP Server logs automatically forwarded for indexing?

A. Logs that result from queries/API calls run through MCP are indexed in the stack.


Q. Why is the Splunk MCP built as an app rather than a standalone codebase that customers can deploy independently? 

Building it as an app introduces several limitations:

- Resource constraints tied to search head capacity

- Search timeout restrictions that may interrupt long-running queries

- Row limits that prevent access to complete datasets

A standalone deployment would provide customers with greater flexibility in resource allocation and eliminate these built-in constraints.

A. Great question! It would definitely be nicer if it were a binary to download. There were plans to make that available for more technical users. I can't speak for product, but I know there are MCP gateways and tooling built into the app. By leveraging the Splunkbase app system, they can manage and maintain capabilities, tooling, etc. more easily.


Q. Can you please explain how you give the LLM context about which index to query for the data?

A. In terms of context, we can either be prescriptive and prompt the LLM to look only at specific indexes, or let it reason by adding knowledge objects and common index names. There are many ways to guide an LLM.
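The prescriptive option can be as simple as baking the allowed indexes into the system prompt handed to the LLM. The prompt wording and index names below are purely illustrative assumptions:

```python
# Illustrative: constrain the LLM to a known set of indexes via the system prompt.

ALLOWED_INDEXES = ["web_logs", "firewall", "app_metrics"]  # hypothetical names

def build_system_prompt(indexes: list[str]) -> str:
    """Compose a system prompt that scopes SPL generation to given indexes."""
    index_list = ", ".join(indexes)
    return (
        "You generate SPL for a Splunk instance. "
        f"Only query these indexes: {index_list}. "
        "If a question cannot be answered from them, say so instead of guessing."
    )

prompt = build_system_prompt(ALLOWED_INDEXES)
```

The same list can double as a server-side allowlist, so the restriction holds even if the model ignores the prompt.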


Q. What are your thoughts on using SOAR as the orchestrator that calls upon agents in a playbook setting?

A. Probably better to do the inverse: let an agent act as a judgment layer on top of SOAR. It really depends on your objective. Adding LLM capabilities into a SOAR playbook has GREAT value!


Q. Are there any specific limitations or complications in integrating AI for cloud customers?

A. It is easier for cloud customers, since SAIA tools are also available.


Q. What is at the heart of Splunk AI? Is there a Splunk LLM being used, or are open-source LLMs used?

A. Splunk AI Assistant by default uses a fine-tuned model that we develop internally.


Q. Have you seen cases where the quality of results from the generated SPL changes depending on the LLM being used?

A. Yes, different LLMs will need some adjustments via a system prompt or RAG; reasoning and tools are important. I will also add that LLMs will use the Generate SPL tool rather than creating the SPL themselves.


Q. What about Splunk internals for large systems, to help admins focus and run a better platform (on-prem)?

A. This is something you can do: your AI is able to access any data, assuming it has the right permissions. There is also work on an AI assistant built into Splunk to reduce MTTR in both Security and Observability.


Q. Seems like MCP is a distant experience for our users, while Assist is integrated into the Splunk UI. What is going to bring all this together into a unified user experience?

A. MCP is available now! The approach we recommend is to use AI assistants such as SPL Assistant to expedite the user experience. Using an MCP host to connect to MCP allows for a conversational approach to a Splunk instance, which is great. But as with any system, knowing the clear use case, the requirements, and your ability to maintain it is important. Think of it as analyzing the ROI of an agent vs. an automation platform; both show value in distinct places.


Q. Why is the MCP server supported on a standalone Search Head rather than in a Search Head Cluster?

A. This is something that we are still working on, please be on the lookout for future releases.


Q. Is there any agent builder other than n8n you would recommend?

A. Depending on your use case and the flexibility required, you may not need an agent builder at all. You might just want a simple LLM MCP host like Claude. n8n offers a lot of flexibility for creating complex workflows. The Splunk AITK also launched an agent builder, which I would recommend trying out.


Q. I have worked with the MCP tool for Splunk, but I usually end up with a token limit error, since there is a lot of data...?

A. This is a fair point; creating systems that are helpful is important. If you have too much data, then adding limits, such as index restrictions, might be necessary.
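Beyond index restrictions, another practical mitigation is trimming results before they reach the LLM so the payload fits the client's context window. The sketch below is illustrative; the roughly 4-characters-per-token heuristic is an assumption, not an MCP setting:

```python
# Illustrative sketch: cap search results to fit an LLM context budget.
# The ~4 chars/token estimate is a rough assumption, not a real tokenizer.

def trim_results(rows: list[str], max_tokens: int = 2000) -> list[str]:
    """Keep whole rows until the estimated token budget is exhausted."""
    kept, used = [], 0
    for row in rows:
        cost = max(1, len(row) // 4)  # crude per-row token estimate
        if used + cost > max_tokens:
            break
        kept.append(row)
        used += cost
    return kept

# Example: 1000 rows of ~110 characters each would blow past a small budget,
# so only the first handful survive trimming.
rows = [f"event {i}: " + "x" * 100 for i in range(1000)]
trimmed = trim_results(rows, max_tokens=500)
```

Pairing a row cap like this with SPL-side limits (`| head`, field selection via `| fields`) keeps both the search and the LLM payload small.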

