Splunk Docker container error "cannot create /opt/container_artifact/splunk-container.state: Permission denied"?

Graham_Hanningt
Builder

I had been successfully using a custom Dockerfile to create a Docker container based on the Splunk-provided Docker image for Splunk 7.2.0:

FROM splunk/splunk:7.2.0

USER root

# Do custom stuff...

USER ${SPLUNK_USER}

# Do more custom stuff...

(With apologies for being coy about the "custom stuff".)

I wanted to upgrade to Splunk 7.3.0, so I updated the FROM instruction to refer to the 7.3.0 tag.

That introduced the following error:

sh: 1: cannot create /opt/container_artifact/splunk-container.state: Permission denied

What changed between 7.2.0 and 7.3.0 to cause this error?

1 Solution

Graham_Hanningt
Builder

Answering my own question, in case it helps someone else...

The error was caused by the introduction of an ansible user into the base Splunk Docker image.

My custom Dockerfile was setting the user to splunk (or rather, to the user specified by the corresponding environment variable). That caused a problem, because an updated shell script in the base Docker image subsequently tried to write to a file that the ansible user could write to, but the splunk user could not.

I fixed the problem by changing the following line in my Dockerfile:

USER ${SPLUNK_USER}

to:

USER ${ANSIBLE_USER}

I might have been able to solve the problem by removing all USER instructions from my Dockerfile and inserting sudo in front of the apt-get command in the "custom stuff" in that Dockerfile. However, after reading, but not fully understanding, some topics on the web that recommended against using sudo in a Dockerfile (I'm neither a Unix expert nor a Docker expert), I decided against that approach.
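Putting the change in context, the updated Dockerfile looks roughly like this. This is a sketch, not my exact file: the comments stand in for the elided "custom stuff", and it assumes (as worked for me) that the 7.3.0 base image defines the ANSIBLE_USER environment variable.

```dockerfile
FROM splunk/splunk:7.3.0

USER root

# Do custom stuff that needs root (e.g. apt-get installs)...

# Switch to the ansible user rather than ${SPLUNK_USER}, so that the base
# image's scripts can write /opt/container_artifact/splunk-container.state.
USER ${ANSIBLE_USER}

# Do more custom stuff...
```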
