
How to send Docker container logs to Splunk?


We're running two processes, Nginx and an application, inside a Docker container. Both process lifecycles are managed through Supervisor. I want to understand: if I use the Docker log driver for Splunk, will Splunk be able to automatically forward both processes' logs to the indexer?



This is a hard question to answer, as it depends on how you set up Supervisor.

Let me start from the beginning. Running multiple processes in the same container is an anti-pattern; try to avoid it as much as possible. Kubernetes, for example, has a great solution for your case: it can deploy two containers in the same Pod and set up communication between them over the shared loopback network interface, so to the processes it looks like they are running in the same container. See the Kubernetes documentation on Pods for details.
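As a rough illustration of that Pod pattern (image names and the port are placeholders, not from the question), the two processes would be split into two containers like this:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: app-with-nginx
spec:
  containers:
    # Containers in the same Pod share a network namespace,
    # so nginx can reach the app container on localhost.
    - name: nginx
      image: nginx:stable
      ports:
        - containerPort: 80
    - name: app
      image: my-app:latest   # placeholder for your application image
```

Each container then writes to its own stdout/stderr, so the logs stay separated per process.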

Unfortunately, Docker does not have this out of the box, and configuring the same setup by hand can be problematic.

There are two options. The first is to redirect stdout and stderr from the processes to supervisord's own stdout/stderr (the Supervisor documentation has an example of this).
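A minimal sketch of that redirection in supervisord.conf (the app command path is a placeholder):

```ini
; Redirect each program's output to the container's stdout/stderr
; so a Docker logging driver can pick it up.
[supervisord]
nodaemon=true

[program:nginx]
command=nginx -g "daemon off;"
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0

[program:app]
command=/usr/local/bin/app   ; placeholder path to your application
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0
```

Setting `stdout_logfile_maxbytes=0` disables Supervisor's log rotation, which would otherwise break writing to `/dev/stdout`.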


In that case, all the logs will end up in the container's stdout/stderr, and you can use a Docker logging driver to send them to Splunk. The problem with this approach is that all the logs - from supervisord, from Nginx, and from every other process - are mixed together in the container's stdout/stderr, so it can be hard to distinguish them.
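For reference, a sketch of running the container with Docker's Splunk logging driver; the URL, token, index, and image name are placeholders for your own HTTP Event Collector settings:

```shell
docker run -d \
  --log-driver=splunk \
  --log-opt splunk-url=https://splunk.example.com:8088 \
  --log-opt splunk-token=00000000-0000-0000-0000-000000000000 \
  --log-opt splunk-index=main \
  --log-opt splunk-sourcetype=docker \
  my-app-image
```

Every line the container writes to stdout/stderr is then forwarded to the HEC endpoint - which is exactly why the mixed-output problem above matters.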

Another approach is to write the application logs to a data volume. Attach the same volume to a Splunk Universal Forwarder or another process (like our collector) and forward these logs as you usually do. Don't forget to set up log rotation.
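A sketch of the shared-volume setup, assuming the application writes its files under /var/log/app (the path, volume name, and image names are illustrative):

```shell
# Application writes its log files into a named volume...
docker run -d -v app-logs:/var/log/app my-app-image

# ...and a Universal Forwarder container mounts the same volume read-only.
docker run -d -v app-logs:/var/log/app:ro splunk/universalforwarder:latest
```

On the forwarder side, a `[monitor:///var/log/app]` stanza in inputs.conf would then pick up the files. Because nothing rotates these files automatically, you need logrotate (or equivalent) in the app container, as noted above.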
