Getting Data In

How can we log and containerize the logs using Kubernetes and Splunk?


Hi Folks,

I came across a post on GitHub that had some fantastic options for pulling data from K8s/Docker into Splunk. It seems the 'easy' approach here is to leverage the K8s/Red Hat integration with Fluentd, and then push the data into Splunk. I was hoping to pick the brains of some of our Splunk experts to see if there is also a way to do a direct-to-Splunk integration. Ideally, our goal is to make sure that the data that comes into Splunk is 'containerized' so that it can be easily organized.

I see the Docker Splunk logging driver is available, but it seems to be the less trusted approach since it doesn't integrate well with K8s.

Splunk Employee

UPDATE: Splunk Connect for Kubernetes is the current Splunk option as of Jan 2020

Check out the work we are doing on our open source docker-itmonitoring project, including log and metadata collection and a prototype app.

We are playing with any and all integrations, but the good ol' UF does some great things as a daemonset! Check out the TAs we have started building, and feel free to contribute your experiences as we shape the future of Splunk's Docker/k8s support!
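As a rough illustration of the daemonset approach (the image tag, namespace, and mount paths below are placeholders, not values from the official TAs), a UF daemonset might look something like this:

```yaml
# Hypothetical sketch of a Universal Forwarder daemonset.
# Image name, namespace, and paths are illustrative placeholders.
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: splunk-uf
  namespace: logging
spec:
  selector:
    matchLabels:
      app: splunk-uf
  template:
    metadata:
      labels:
        app: splunk-uf
    spec:
      containers:
      - name: splunk-uf
        image: splunk/universalforwarder:latest   # placeholder tag
        volumeMounts:
        - name: container-logs
          mountPath: /var/log/containers
          readOnly: true
        - name: docker-logs
          mountPath: /var/lib/docker/containers
          readOnly: true
      volumes:
      - name: container-logs
        hostPath:
          path: /var/log/containers
      - name: docker-logs
        hostPath:
          path: /var/lib/docker/containers
```

Both host paths are mounted because the symlinks in /var/log/containers point into /var/lib/docker/containers, where the actual log files live.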

- MattyMo

New Member

@mmodestino - I had a look at GitHub. Running UFs on K8s or OpenShift hosts, either natively or as a container, means I'd have to run them as root rather than as a splunk user. On OpenShift, the /var/log/container directory is a symlink into /var/lib/docker, where the actual logs are stored. Is it advisable to set an ACL on /var/lib/docker granting read access to the splunk user? I've seen similar suggestions on other posts where a non-root user needs access to log directories owned by root.
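For context, the ACL approach being asked about might look something like the following sketch (run as root; the paths and the splunk user name assume a stock Docker/OpenShift layout):

```shell
# Grant the splunk user read access (and traversal on directories)
# across the Docker container log tree.
setfacl -R -m u:splunk:rX /var/lib/docker/containers

# Set a default ACL so newly created log files inherit the same access.
setfacl -R -d -m u:splunk:rX /var/lib/docker/containers

# Verify the ACL was applied.
getfacl /var/lib/docker/containers | grep splunk
```

The capital X grants execute only on directories, so files stay read-only; the default (-d) entry matters because Docker creates new log files as containers start.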


Splunk Employee

Technically, this is easily avoided with daemonsets by letting the container run with privileges, but I am looking to confirm with my friends at Red Hat, as I see no reason some creative Linux'ing can't make that happen.
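In daemonset terms, "running with privileges" would mean a container security context along these lines (a hypothetical fragment of a pod spec, not an official manifest):

```yaml
# Illustrative fragment: run the forwarder container as root and
# privileged so it can read logs under /var/lib/docker on the host.
securityContext:
  runAsUser: 0
  privileged: true
```

On OpenShift this also typically requires the pod's service account to be granted a security context constraint that permits privileged containers.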

- MattyMo


We just published the first version of our application "Monitoring Kubernetes" and its collector. Please take a look at our manual on how to get started.


You can always use the Splunk Universal Forwarder: just create a k8s daemonset with it. However, this approach has some drawbacks compared to the Fluentd-based approach.

  1. Tight coupling between the logging agent and Splunk: Log data has many use cases, and you may want to send the logs to other systems such as Amazon S3 or Google Cloud Storage. For such use cases, the Fluentd-based approach is more robust, because Fluentd Enterprise can send your container logs to multiple systems through a unified log pipeline.
  2. Cost management: You may want to pre-process and filter the logs you send to Splunk. This is very easy with Fluentd. Moreover, you can easily extend it with custom filters to meet your unique needs and handle unique log formats. You do not lose any log data even if you filter it in Fluentd: simply redirect all raw logs into much cheaper cold storage such as Amazon Glacier.
  3. k8s/Docker compatibility: Fluentd is an official Cloud Native Computing Foundation project, and as such, it collaborates closely with the rest of the container/container-orchestration ecosystem to ensure forward compatibility with Docker/k8s.
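As a sketch of point 2 (the tag pattern, record key, and regex here are assumptions, not a drop-in config), a Fluentd grep filter that drops noisy health-check lines before they reach Splunk could look like:

```
# Illustrative Fluentd filter: discard container log lines that match
# a noise pattern before they are forwarded to any output.
<filter kubernetes.**>
  @type grep
  <exclude>
    key log
    pattern /healthz|readiness/
  </exclude>
</filter>
```

A separate, unfiltered copy of the raw stream could be routed to cheap object storage through another output plugin, which is how the "filter for Splunk, keep everything in cold storage" pattern above works.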

Full disclosure: I work at Treasure Data, where we offer Fluentd Enterprise, a commercial offering built around Fluentd. If you are interested, check out the website and the Splunk optimization module.

Splunk Employee

Thanks for the answer and the honesty/disclosure! Very cool of you.
