Getting Data In

How to index Kubernetes STDOUT data in Splunk?

dhavamanis
Builder

Need your help,

Can you please tell us how to receive Kubernetes STDOUT data in Splunk Enterprise? Kubernetes is running on CoreOS.

Thank you,


pkisplunk
Explorer

We used Fluentd with Splunk Cloud and it worked seamlessly.

If anyone using Splunk Cloud sees this answer: the methods in this thread apply to both Splunk Enterprise and Splunk Cloud.

mattymo
Splunk Employee

Hey dhavamanis,

We have released Splunk Connect for Kubernetes!

It uses Fluentd and Heapster to get you logs, metrics, and metadata, and it is Splunk-built and supported!

Check it out!

https://github.com/splunk/splunk-connect-for-kubernetes
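
If you go the Helm route, a minimal values.yaml sketch looks something like this (the key layout follows the repo's README; the HEC host and token below are placeholders, so double-check against the current chart before using it):

```yaml
# Minimal sketch of Helm values for Splunk Connect for Kubernetes.
# The HEC host and token are placeholders that must be replaced.
global:
  splunk:
    hec:
      host: splunk-hec.example.com          # your HTTP Event Collector endpoint
      port: 8088
      token: 00000000-0000-0000-0000-000000000000
      protocol: https
splunk-kubernetes-logging:
  enabled: true    # container stdout/stderr
splunk-kubernetes-objects:
  enabled: true    # Kubernetes object state/metadata
splunk-kubernetes-metrics:
  enabled: true    # cluster metrics
```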

- MattyMo

abdulc
New Member

Is there a way to trim ANSI color codes from fluentd-hec output, similar to what is suggested in https://github.com/mattheworiordan/fluent-plugin-color-stripper, at the pod level?
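
Something along these lines is what I am picturing, assuming the logs pass through a Fluentd instance I control and using the stock record_transformer filter instead of the plugin linked above; the ConfigMap name, the tag pattern, and the `log` field name are placeholders on my part:

```yaml
# Sketch only: a ConfigMap fragment carrying an extra Fluentd filter that strips
# ANSI color escape sequences. The ConfigMap name, tag pattern, and the
# assumption that the message lives in the "log" field are hypothetical.
apiVersion: v1
kind: ConfigMap
metadata:
  name: fluentd-strip-ansi
data:
  strip-ansi.conf: |
    # Match only the pods whose logs need cleaning, e.g. one app's containers.
    <filter kubernetes.var.log.containers.myapp-**>
      @type record_transformer
      enable_ruby true
      <record>
        # Remove SGR (color) escape sequences such as \e[32m ... \e[0m
        log ${record["log"].to_s.gsub(/\e\[[0-9;]*m/, "")}
      </record>
    </filter>
```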


outcoldman
Communicator

We just published the first version of our application "Monitoring Kubernetes" (https://splunkbase.splunk.com/app/3743/) and its collector (https://www.outcoldsolutions.com). Please take a look at our manual on how to get started: https://www.outcoldsolutions.com/docs/monitoring-kubernetes/


agup006
Explorer

Hi Dhavamanis,

Fluentd is one of the preferred logging layers for Kubernetes and a common way to route Kubernetes data to Splunk, Elasticsearch, Kafka, Amazon S3, and other destinations. Using a Kubernetes DaemonSet, you can deploy a Fluentd pod on every Kubernetes node and configure it to route stdout, stderr, and other data to Splunk, Elasticsearch, and so on. Fluentd can also enrich each event with information about the Kubernetes Pod, Namespace, and Node.

Documentation on the Kubernetes DaemonSet: https://kubernetes.io/docs/concepts/workloads/controllers/daemonset/
Documentation on the Fluentd DaemonSet: http://docs.fluentd.org/v0.12/articles/kubernetes-fluentd
Documentation on the Fluentd-Elasticsearch DaemonSet: http://docs.fluentd.org/v0.12/articles/kubernetes-fluentd#logging-to-elasticsearch
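
As a rough illustration of the DaemonSet approach described above, here is a minimal sketch of a Fluentd DaemonSet that tails container stdout/stderr on each node and forwards it to Splunk HEC. The namespace, image tag, environment variable names, and Secret name are assumptions for illustration, not something from this thread; check the fluentd-kubernetes-daemonset images and your Splunk output plugin's docs for the real settings:

```yaml
# Minimal sketch of a Fluentd DaemonSet for shipping container logs to Splunk HEC.
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: fluentd
  namespace: kube-system
spec:
  selector:
    matchLabels:
      app: fluentd
  template:
    metadata:
      labels:
        app: fluentd
    spec:
      containers:
        - name: fluentd
          # Assumed image variant; pick the fluentd-kubernetes-daemonset image
          # that bundles the Splunk HEC output plugin you intend to use.
          image: fluent/fluentd-kubernetes-daemonset:v1-debian-splunkhec
          env:
            - name: SPLUNK_HEC_HOST        # assumed variable name
              value: splunk-hec.example.com
            - name: SPLUNK_HEC_TOKEN       # assumed variable name
              valueFrom:
                secretKeyRef:
                  name: splunk-hec
                  key: token
          volumeMounts:
            - name: varlog
              mountPath: /var/log
            - name: containers
              mountPath: /var/lib/docker/containers
              readOnly: true
      volumes:
        - name: varlog
          hostPath:
            path: /var/log
        - name: containers
          hostPath:
            path: /var/lib/docker/containers
```

On Docker-based nodes, the files under /var/log/containers are symlinks into /var/lib/docker/containers, which is why both host paths are mounted.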

If you want a supported Splunk plugin and a Kubernetes-to-Splunk DaemonSet, Fluentd Enterprise offers SLA-backed support for sending data to Splunk Enterprise and Splunk Cloud. If you want more information, you can email me at A@ Treasuredata.com or find more details here: https://fluentd.treasuredata.com

Thanks,
Anurag

mattymo
Splunk Employee

What path did you end up on?

- MattyMo

MuS
Legend

Hi dhavamanis,

This is not a Splunk problem but a Kubernetes problem; nevertheless, a quick Google search revealed this:

When a cluster is created, the standard output and standard error output of each container can be ingested using a Fluentd agent running on each node into either Google Cloud Logging or into Elasticsearch and viewed with Kibana.

From here https://github.com/kubernetes/kubernetes/blob/master/docs/getting-started-guides/logging.md

If you can get it into ES/Kibana, you can get it into Splunk 😉

Hope this helps; and no, I have no idea what Kubernetes is and cannot be of further help 🙂

cheers, MuS


vam111
New Member

What's the latest way to forward Kubernetes application logs (from the containers in Pods) to Splunk?

I want to understand how a pull-based method for fetching container-level data from a Google Kubernetes cluster can be configured for Splunk.
