<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How can we log and containerize the logs using Kubernetes and Splunk? in Getting Data In</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357071#M65213</link>
    <description>&lt;P&gt;Thanks for the answer and the honesty/disclosure! Very cool of you.&lt;/P&gt;</description>
    <pubDate>Fri, 30 Jun 2017 12:27:32 GMT</pubDate>
    <dc:creator>sloshburch</dc:creator>
    <dc:date>2017-06-30T12:27:32Z</dc:date>
    <item>
      <title>How can we log and containerize the logs using Kubernetes and Splunk?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357069#M65211</link>
      <description>&lt;P&gt;Hi Folks;&lt;/P&gt;

&lt;P&gt;I came across this post on GitHub, &lt;A href="https://github.com/kubernetes/kubernetes/issues/24677"&gt;https://github.com/kubernetes/kubernetes/issues/24677&lt;/A&gt;, and it has some fantastic options for pulling data from K8s/Docker into Splunk. It seems that the 'easy' approach here is to leverage the integration of K8s/Red Hat with Fluentd and then push the data into Splunk. I was hoping to pick the brains of some of our Splunk experts to see if there is also a way to do a direct-to-Splunk integration. Ideally, our goal is to make sure that the data that comes into Splunk is 'containerized' so that it can easily be organized.&lt;/P&gt;

&lt;P&gt;I see that the Docker Splunk logging driver is available, but it seems to be a less trusted approach since it doesn't integrate well with K8s.&lt;/P&gt;</description>
      <pubDate>Mon, 01 May 2017 15:47:11 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357069#M65211</guid>
      <dc:creator>paimonsoror</dc:creator>
      <dc:date>2017-05-01T15:47:11Z</dc:date>
    </item>
    <item>
      <title>Re: How can we log and containerize the logs using Kubernetes and Splunk?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357070#M65212</link>
      <description>&lt;P&gt;You can always use Splunk Universal Forwarder. Just create a k8s daemonset with it. However, this approach has some drawbacks compared to the Fluentd-based approach.&lt;/P&gt;

&lt;OL&gt;
&lt;LI&gt;&lt;STRONG&gt;Tight coupling between the logging agent and Splunk&lt;/STRONG&gt;: Log data has many use cases, and you may also want to send the logs to other systems such as Amazon S3 or Google Cloud Storage. For such use cases, the Fluentd-based approach is more robust because Fluentd Enterprise can send your container logs into multiple systems with a unified log pipeline.&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Cost management&lt;/STRONG&gt;: You may want to pre-process and filter the logs you send to Splunk. This is very easy with Fluentd. Moreover, you can easily extend it and implement custom filters to meet your unique needs and handle unique log formats. You do not lose any log data even if you filter it in Fluentd: simply redirect all raw logs into much cheaper cold storage such as Amazon Glacier.&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;k8s/Docker compatibility&lt;/STRONG&gt;: Fluentd is an official Cloud Native Computing Foundation project, and as such, it collaborates closely with the rest of the container/container orchestration ecosystem to ensure forward compatibility with Docker/k8s.&lt;/LI&gt;
&lt;/OL&gt;
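
&lt;P&gt;The filtering and multi-destination routing described above could be sketched in a Fluentd config roughly like this. The plugin names and parameters (a Splunk HEC output, an S3 output, the grep pattern) are illustrative assumptions; check the docs of the plugins you actually install.&lt;/P&gt;

```
# Hypothetical Fluentd config sketch: drop noisy lines before forwarding to
# Splunk, and copy the raw stream to cheap cold storage in parallel.
&lt;filter kubernetes.**&gt;
  @type grep
  &lt;exclude&gt;
    key log
    pattern /healthcheck/
  &lt;/exclude&gt;
&lt;/filter&gt;

&lt;match kubernetes.**&gt;
  @type copy
  &lt;store&gt;
    @type splunk_hec          # e.g. a Splunk HEC output plugin
    hec_host splunk.example.com
    hec_token YOUR_TOKEN
  &lt;/store&gt;
  &lt;store&gt;
    @type s3                  # raw copy to cold storage
    s3_bucket raw-container-logs
    path raw/
  &lt;/store&gt;
&lt;/match&gt;
```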

&lt;P&gt;Full disclosure: I work at Treasure Data, where we offer Fluentd Enterprise, a commercial offering built around Fluentd. If you are interested, check out &lt;A href="https://fluentd.treasuredata.com"&gt;the website&lt;/A&gt; and the &lt;A href="https://fluentd.treasuredata.com/splunk-optimize/"&gt;Splunk optimization module&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Fri, 30 Jun 2017 01:55:28 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357070#M65212</guid>
      <dc:creator>kiyototamura</dc:creator>
      <dc:date>2017-06-30T01:55:28Z</dc:date>
    </item>
    <item>
      <title>Re: How can we log and containerize the logs using Kubernetes and Splunk?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357071#M65213</link>
      <description>&lt;P&gt;Thanks for the answer and the honesty/disclosure! Very cool of you.&lt;/P&gt;</description>
      <pubDate>Fri, 30 Jun 2017 12:27:32 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357071#M65213</guid>
      <dc:creator>sloshburch</dc:creator>
      <dc:date>2017-06-30T12:27:32Z</dc:date>
    </item>
    <item>
      <title>Re: How can we log and containerize the logs using Kubernetes and Splunk?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357072#M65214</link>
      <description>&lt;P&gt;We just published first version of our application "Monitoring Kubernetes" (&lt;A href="https://splunkbase.splunk.com/app/3743/"&gt;https://splunkbase.splunk.com/app/3743/&lt;/A&gt;) and collector (&lt;A href="https://www.outcoldsolutions.com"&gt;https://www.outcoldsolutions.com&lt;/A&gt;). Please take a look on our manual how to get started &lt;A href="https://www.outcoldsolutions.com/docs/monitoring-kubernetes/"&gt;https://www.outcoldsolutions.com/docs/monitoring-kubernetes/&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 10 Oct 2017 04:23:25 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357072#M65214</guid>
      <dc:creator>outcoldman</dc:creator>
      <dc:date>2017-10-10T04:23:25Z</dc:date>
    </item>
    <item>
      <title>Re: How can we log and containerize the logs using Kubernetes and Splunk?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357073#M65215</link>
      <description>&lt;P&gt;UPDATE: Splunk Connect for Kubernetes is the current Splunk option as of Jan 2020&lt;BR /&gt;
&lt;A href="https://github.com/splunk/splunk-connect-for-kubernetes"&gt;https://github.com/splunk/splunk-connect-for-kubernetes&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;Check out the work we are doing on our open-source docker-itmonitoring project, including logs, metadata, and a prototype app.&lt;/P&gt;

&lt;P&gt;&lt;A href="https://github.com/splunk/docker-itmonitoring/tree/7.0.0-k8s"&gt;https://github.com/splunk/docker-itmonitoring/tree/7.0.0-k8s&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;We are playing with any and all integrations, but the good ol' UF does some great things as a DaemonSet! Check out the TAs we have started building, and feel free to contribute your experiences as we shape the future of Splunk's Docker/k8s support!&lt;/P&gt;</description>
      <pubDate>Sat, 06 Jan 2018 21:10:43 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357073#M65215</guid>
      <dc:creator>mattymo</dc:creator>
      <dc:date>2018-01-06T21:10:43Z</dc:date>
    </item>
    <item>
      <title>Re: How can we log and containerize the logs using Kubernetes and Splunk?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357074#M65216</link>
      <description>&lt;P&gt;@mmodestino - I had a look at Github. Running UFs on K8s or Openshift hosts either normally or as a container means I’d have to run it as root rather than a Splunk user. For Openshift, the  /var/log/container directory is a symlink to the /var/lib/docker where the actual logs are stored. Is it advisable to set the ACL on /var/lib/docker with read access to the splunk user? I’ve seen a similar suggestions on other posts where a no root user needs access to log directories owned by root. &lt;/P&gt;</description>
      <pubDate>Sun, 18 Mar 2018 16:37:17 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357074#M65216</guid>
      <dc:creator>sayeed101</dc:creator>
      <dc:date>2018-03-18T16:37:17Z</dc:date>
    </item>
    <item>
      <title>Re: How can we log and containerize the logs using Kubernetes and Splunk?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357075#M65217</link>
      <description>&lt;P&gt;Technically, this is easily avoided with daemonsets and letting the container run with privs, but I am looking to confirm with my friends at Red Hat, as I see no reason that some creative linux'ing cant make that happen. &lt;/P&gt;</description>
      <pubDate>Fri, 27 Apr 2018 13:12:59 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-can-we-log-and-containerize-the-logs-using-Kubernetes-and/m-p/357075#M65217</guid>
      <dc:creator>mattymo</dc:creator>
      <dc:date>2018-04-27T13:12:59Z</dc:date>
    </item>
  </channel>
</rss>