Getting Data In

Ingesting application logs from Docker container into Splunk Cloud

tawm_12
Engager

Hi everyone,

I'm seeking advice on the best way to send application logs from our client's Docker containers into a Splunk Cloud instance, and I’d appreciate your input and experiences.

Currently, my leading approach involves using Docker’s "Splunk logging driver" to forward data via the HTTP Event Collector (HEC). However, my understanding is that this method primarily sends container-level data rather than detailed application logs.

Another method I came across involves deploying Splunk's Docker image to create a standalone Enterprise container alongside the Universal Forwarder. The idea here is to set up monitors in the forwarder's inputs.conf to send data to the Enterprise instance and then route it via a Heavy Forwarder to Splunk Cloud.

Has anyone successfully implemented either of these approaches—or perhaps a different method—to ingest application logs from Docker containers into Splunk Cloud? Any insights, tips, or shared experiences would be greatly appreciated.

Thanks in advance for your help!

Cheers,

0 Karma

livehybrid
Champion

Hi @tawm_12 ,

The simplest method is often configuring your applications within the containers to log to stdout/stderr and then using the Docker Splunk logging driver to forward these logs directly to your Splunk Cloud HEC endpoint.

If your applications must log to files within the container filesystem, you can use a Universal Forwarder (UF) sidecar container.

Method 1: Docker Logging Driver (Recommended if apps log to stdout/stderr)

  1. Configure your application inside the Docker container to write its logs to standard output (stdout) and standard error (stderr). This is a common practice for containerized applications.
  2. Configure the Docker daemon or individual containers to use the splunk logging driver, pointing it to your Splunk Cloud HEC endpoint and token.

Example docker run command:

docker run \
 --log-driver=splunk \
 --log-opt splunk-token=<your-hec-token> \
 --log-opt splunk-url=https://<your-splunk-cloud-instance>:8088 \
 --log-opt splunk-format=json \
 --log-opt splunk-verify-connection=false \
 # Add other options like splunk-sourcetype, splunk-index, tag, etc.
 your-application-image

This method leverages Docker's built-in logging capabilities. The driver captures the container's stdout/stderr streams (which contain your application logs if configured correctly) and forwards them via HEC.
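If you want every container on the host to use the Splunk driver by default, the same options can be set once in the Docker daemon configuration instead of per `docker run`. A minimal sketch of `/etc/docker/daemon.json` (token and URL are placeholders to replace with your own values):

```json
{
  "log-driver": "splunk",
  "log-opts": {
    "splunk-token": "<your-hec-token>",
    "splunk-url": "https://<your-splunk-cloud-instance>:8088",
    "splunk-format": "json",
    "splunk-verify-connection": "false"
  }
}
```

Note that all `log-opts` values must be strings, the Docker daemon needs a restart for this to take effect, and a per-container `--log-driver` flag still overrides the daemon default.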

Method 2: Universal Forwarder Sidecar (If apps log to files)

  1. Deploy a Splunk Universal Forwarder container alongside your application container.
  2. Mount the volume containing the application log files into both the application container (for writing) and the UF container (for reading).
  3. Configure the UF container's inputs.conf to monitor the log files within the mounted volume.
  4. Configure the UF container's outputs.conf to forward data to your Splunk Cloud HEC endpoint or an intermediate Heavy Forwarder. Using HEC output from the UF is generally preferred for Splunk Cloud.
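The sidecar pattern above can be sketched as a docker-compose file. The image name, paths, and mount points here are illustrative assumptions, not a drop-in config:

```yaml
version: "3"
services:
  app:
    image: your-application-image          # hypothetical app image
    volumes:
      - app-logs:/var/log/app              # app writes its log files here
  uf:
    image: splunk/universalforwarder:latest
    environment:
      SPLUNK_START_ARGS: --accept-license
      SPLUNK_PASSWORD: <admin-password>    # placeholder
    volumes:
      - app-logs:/path/to/mounted/logs:ro  # same volume, mounted read-only
      # also mount your inputs.conf/outputs.conf into the UF's app directory
volumes:
  app-logs:
```

The key design point is the shared named volume: the app writes, the UF reads, and neither container needs to know the other exists.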

Example UF inputs.conf:

[monitor:///path/to/mounted/logs/app.log]
sourcetype = your_app_sourcetype
index = your_app_index
disabled = false

Example UF outputs.conf (for HEC):

[httpout]
uri = https://<your-splunk-cloud-instance>:8088
httpEventCollectorToken = <your-hec-token>
# Consider sslVerifyServerCert = true in production after cert setup
sslVerifyServerCert = false
useACK = true

[tcpout:splunk_cloud_forwarder]
server = <heavy-forwarder-host>:<port> # Use if forwarding via UF->HF->Splunk Cloud S2S
# Other S2S settings...
# disabled = true # Disable if using httpout

The UF actively monitors the specified log files and forwards new events. This is suitable when applications cannot log to stdout/stderr. The UF sidecar runs in parallel with your app container, sharing the log volume.

To be clear: the Docker logging driver *does* send application logs, as long as the application directs them to the container's stdout/stderr.
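Under the hood, both the logging driver and the UF's httpout deliver events to HEC's collector endpoint as JSON. A minimal Python sketch of what a single HEC event body looks like (the field values and token are placeholders, not anything Splunk-specific beyond the standard event envelope):

```python
import json

def hec_event(message: str, sourcetype: str, index: str, host: str) -> str:
    """Build the JSON body for one event POSTed to /services/collector/event."""
    payload = {
        "event": message,          # the raw application log line
        "sourcetype": sourcetype,  # e.g. your_app_sourcetype
        "index": index,            # e.g. your_app_index
        "host": host,
    }
    return json.dumps(payload)

body = hec_event("app started", "your_app_sourcetype", "your_app_index", "web-01")
# The HTTP request carries the header: Authorization: Splunk <your-hec-token>
```

Seeing the envelope makes it clear why `sourcetype` and `index` can be set either in the driver/UF config or left to the HEC token's defaults.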

    1. The approach involving a separate Splunk Enterprise container solely for forwarding is overly complex and not typically recommended. A UF can forward directly or via a standard Heavy Forwarder infrastructure.
    2. If you are running in Kubernetes, consider the Splunk OpenTelemetry Collector for Kubernetes (the successor to Splunk Connect for Kubernetes), which streamlines log collection.
    3. Prefer HEC for sending data to Splunk Cloud. See Splunk Lantern: Getting Data In - Best Practices for Getting Data into Splunk.

🌟 Did this answer help you? If so, please consider:

  • Adding kudos to show it was useful
  • Marking it as the solution if it resolved your issue
  • Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing

kiran_panchavat
Influencer

@tawm_12 

We recently completed an integration for one of our customers using the following resources:

 

https://stackoverflow.com/questions/53287922/how-to-forward-application-logs-to-splunk-from-docker-c... 

https://www.splunk.com/en_us/blog/tips-and-tricks/splunk-logging-driver-for-docker.html?_gl=1*1tdlq7... 

Did this help? If yes, please consider giving kudos, marking it as the solution, or commenting for clarification — your feedback keeps the community going!