Hi folks,
I came across this post on GitHub, https://github.com/kubernetes/kubernetes/issues/24677, and it had some fantastic options for pulling data from K8s/Docker into Splunk. The 'easy' approach seems to be to leverage the K8s/Red Hat integration with Fluentd and then push the data into Splunk from there. I was hoping to pick the brains of some of our Splunk experts to see whether there is also a way to do a direct-to-Splunk integration. Ideally, our goal is to make sure that the data that comes into Splunk stays 'containerized' so that it can easily be organized.
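From what I can tell, the Fluentd route boils down to an output stanza along these lines. This is just a sketch on my part: it assumes the fluent-plugin-splunk-hec output plugin and an HTTP Event Collector (HEC) token already set up on the Splunk side, and the host, port, token, index, and sourcetype values are placeholders (exact parameter names depend on the plugin version).

    <match kube.**>
      # send events to Splunk over the HTTP Event Collector
      @type splunk_hec
      hec_host splunk.example.com
      hec_port 8088
      hec_token 00000000-0000-0000-0000-000000000000
      index main
      sourcetype kube:container
    </match>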
I see the Docker Splunk logging driver is available, but it seems to be the less trusted approach since it doesn't integrate well with K8s.
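For reference, my understanding is that the logging driver is enabled per container (or daemon-wide) roughly like this; the URL, token, and image name below are placeholders, and I haven't validated how well this plays with K8s:

    # enable the Splunk logging driver for a single container
    docker run --log-driver=splunk \
        --log-opt splunk-url=https://splunk.example.com:8088 \
        --log-opt splunk-token=00000000-0000-0000-0000-000000000000 \
        --log-opt splunk-sourcetype=docker \
        --log-opt tag="{{.Name}}/{{.ID}}" \
        your-image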
UPDATE: Splunk Connect for Kubernetes is the current Splunk option as of Jan 2020
https://github.com/splunk/splunk-connect-for-kubernetes
Check out the work we are doing on our open source docker-itmonitoring project, including log and metadata collection and a prototype app.
https://github.com/splunk/docker-itmonitoring/tree/7.0.0-k8s
We are playing with any and all integrations, but the good ol' UF does some great things as a DaemonSet! Check out the TAs we have started building, and feel free to contribute your experiences as we shape the future of Splunk's Docker/K8s support going forward!
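To give a rough idea of the DaemonSet approach, here is a minimal sketch, not a supported config: it assumes the splunk/universalforwarder image and its SPLUNK_START_ARGS/SPLUNK_PASSWORD environment variables, and the namespace, names, and password handling are placeholders (use a Secret in a real deployment). Both /var/log/containers and /var/lib/docker/containers are mounted because the former is just symlinks into the latter.

    apiVersion: apps/v1
    kind: DaemonSet
    metadata:
      name: splunk-uf
      namespace: splunk
    spec:
      selector:
        matchLabels:
          app: splunk-uf
      template:
        metadata:
          labels:
            app: splunk-uf
        spec:
          containers:
          - name: splunk-uf
            image: splunk/universalforwarder:latest
            env:
            - name: SPLUNK_START_ARGS
              value: "--accept-license"
            - name: SPLUNK_PASSWORD
              value: "changeme"          # placeholder only; use a Secret
            volumeMounts:
            - name: container-logs
              mountPath: /var/log/containers
              readOnly: true
            - name: docker-logs
              mountPath: /var/lib/docker/containers
              readOnly: true
          volumes:
          - name: container-logs
            hostPath:
              path: /var/log/containers
          - name: docker-logs
            hostPath:
              path: /var/lib/docker/containers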
@mmodestino - I had a look at GitHub. Running UFs on K8s or OpenShift hosts, either natively or as a container, means I'd have to run them as root rather than as the splunk user. On OpenShift, the logs under /var/log/containers are symlinks into /var/lib/docker, where the actual log files are stored. Is it advisable to set an ACL on /var/lib/docker granting read access to the splunk user? I've seen similar suggestions on other posts where a non-root user needs access to log directories owned by root.
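Something like this is what I had in mind; just a sketch, assuming the host filesystem supports POSIX ACLs and a local splunk user exists:

    # grant the splunk user recursive read/traverse access to existing files
    setfacl -R -m u:splunk:rX /var/lib/docker/containers
    # set a default ACL so newly created log files inherit the same access
    setfacl -R -d -m u:splunk:rX /var/lib/docker/containers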
Technically, this is easily avoided with DaemonSets by letting the container run with privileges, but I am looking to confirm with my friends at Red Hat, as I see no reason some creative Linux work can't make that happen.
We just published the first version of our application "Monitoring Kubernetes" (https://splunkbase.splunk.com/app/3743/) and our collector (https://www.outcoldsolutions.com). Please take a look at our manual on how to get started: https://www.outcoldsolutions.com/docs/monitoring-kubernetes/
You can always use the Splunk Universal Forwarder: just create a K8s DaemonSet for it. However, this approach has some drawbacks compared to the Fluentd-based approach.
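If you go that route, the forwarder's file monitoring is plain inputs.conf; a minimal sketch, where the index and sourcetype names are placeholders you would choose yourself:

    [monitor:///var/log/containers/*.log]
    disabled = false
    index = kubernetes
    sourcetype = kube:container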
Full disclosure: I work at Treasure Data, where we offer Fluentd Enterprise, a commercial offering built around Fluentd. If you are interested, check out the website and the Splunk optimization module.
Thanks for the answer and the honesty/disclosure! Very cool of you.