All Apps and Add-ons

Kubernetes/Docker json logs

sylvainlectra
Explorer

I'm currently trying to set up streaming of Kubernetes/Docker logs into Splunk.

As you might know, Docker stores its container logs in files with JSON syntax:

  {"log": "this is one log line", "stream": "stdout", "time": "2017-10-30T21:30:19.379796735Z"}
  {"log": "this is another log line", "stream": "stdout", "time": "2017-10-30T21:30:19.45Z"}

I have set up the Splunk Universal Forwarder to ingest those log files and send them to the indexers, but the feedback I have received from the developers is that this is completely unreadable, and I tend to agree with them...

I've tried everything in search queries to remodel the event object and discard everything but the log field of the Docker JSON, without success.

Another problem is that mixed sources (from different software) are written to those logs, so different formats can end up in the log field: raw text, JSON (escaped by Docker), etc.

The first thing I'd like to do is extract the log field of the Docker JSON and send only that to Splunk.

Then I'd like to apply the correct sourcetype to the log data, e.g. JSON, access combined, or anything else.

Regards.

1 Solution

outcoldman
Communicator

Did you have a chance to look at our application Monitoring Kubernetes and our collector for Kubernetes? https://splunkbase.splunk.com/app/3743/

Our collector parses logs in the expected format, joins multiline log entries, and enriches the data with Kubernetes metadata. In addition, it collects metrics from hosts, including containers, pods, and the processes inside those containers, as well as from the host itself.

I am a developer myself, and this setup helps me debug and diagnose a lot of issues.

If you have any issues, contact me at denis@outcoldsolutions.com


mattymo
Splunk Employee
Splunk Employee

Hey sylvainlectra!

UPDATE (Dec 17, 2018): USE SPLUNK CONNECT FOR KUBERNETES!! https://github.com/splunk/splunk-connect-for-kubernetes and check out our beta app for containers! https://www.splunk.com/en_us/form/splunk-insights-for-containers.html

UPDATE: I have pushed a working props.conf solution to our docker-itmonitoring GitHub that will allow you to clean up Docker logs and apply existing sourcetypes/extractions natively in Splunk:

https://github.com/splunk/docker-itmonitoring/blob/7.0.0-k8s/app-k8s/default/props.conf

This will allow us to strip the JSON off the logs, as well as support multiline logging and multiple line-breaking schemes for each pod type/log type.

Let me know if you'd like to see it in action!

I am currently working in the same area, and yeah, json wrapped logs are a pain to deal with.

How are your devs interacting with the logs? Searching raw logs from the search bar?

Have you tried curating views for your devs so that they are looking at a dashboard panel of events by namespace or pod, using | table _time stream log ?
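A minimal sketch of that kind of search (the sourcetype name kubernetes is illustrative, and spath assumes each event is the raw Docker JSON):

  sourcetype=kubernetes
  | spath input=_raw
  | table _time stream log

spath explicitly extracts the JSON fields at search time, so the table shows just the timestamp, stream, and log line instead of the full wrapper.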

[screenshot: dashboard panel showing events rendered with | table _time stream log]

It's not perfect (it doesn't help with extractions), but depending on how the end users actually use the logs, it might buy you some time or solve their pain.

The other thing I am trying is to simply use sed to drop the wrapper and only keep the log line. That would be done with props and transforms, and it also isn't fully ideal, as we would be using compute power to throw data away... once I nail down the config, I'll post it.
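A hedged sketch of what that sed-style config could look like, using SEDCMD in props.conf at parsing time (the sourcetype name kubernetes is illustrative, and the regex assumes single-line json-file entries with the fields in log/stream/time order; escaped quotes inside the log line would need a more careful pattern):

  # props.conf (on the indexer or heavy forwarder)
  [kubernetes]
  # keep only the contents of the "log" field, drop the JSON wrapper
  SEDCMD-strip_docker_wrapper = s/^{"log":"(.*)","stream":"[^"]*","time":"[^"]*"}$/\1/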

As for the different data sources, you will likely need to look at one of the existing TAs that do sourcetype renaming on syslog. Maybe check out the Juniper app or the like, where the data comes in as sourcetype kubernetes and then, based on a regex match of the log, you rename it and apply a new sourcetype.
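A minimal sketch of that index-time sourcetype override mechanism (the stanza names, regexes, and target sourcetypes here are all illustrative):

  # props.conf
  [kubernetes]
  TRANSFORMS-set_sourcetype = st_json, st_access_combined

  # transforms.conf
  [st_json]
  REGEX = ^\s*\{
  DEST_KEY = MetaData:Sourcetype
  FORMAT = sourcetype::_json

  [st_access_combined]
  REGEX = "(GET|POST|PUT|DELETE|HEAD) [^"]+" \d{3}
  DEST_KEY = MetaData:Sourcetype
  FORMAT = sourcetype::access_combined

Transforms run in the listed order, so events matching the first regex get the JSON sourcetype and the rest fall through to later patterns or keep the original sourcetype.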

Feel free to come find me on the Slack chat in #kubernetes or #docker! Would love client feedback as we look at building solutions in this area.

- MattyMo


sylvainlectra
Explorer

Well, why do something yourself that someone has already done?

It seems to be everything that I wanted.

Thanks.
