Splunk Docker container error "cannot create /opt/container_artifact/splunk-container.state: Permission denied"?

Graham_Hanningt
Builder

I had been successfully using a custom Dockerfile to create a Docker container based on the Splunk-provided Docker image for Splunk 7.2.0:

FROM splunk/splunk:7.2.0

USER root

# Do custom stuff...

USER ${SPLUNK_USER}

# Do more custom stuff...

(With apologies for being coy about the "custom stuff".)

I wanted to upgrade to Splunk 7.3.0, so I updated the FROM instruction to refer to the 7.3.0 tag.

That introduced the following error:

sh: 1: cannot create /opt/container_artifact/splunk-container.state: Permission denied

What changed between 7.2.0 and 7.3.0 to cause this error?

1 Solution

Graham_Hanningt
Builder

Answering my own question, in case it helps someone else...

The error was caused by the introduction of an ansible user into the base Splunk Docker image.

My custom Dockerfile was setting the user to splunk (or rather, to the user specified by the corresponding environment variable). That caused a problem, because an updated shell script in the base Docker image subsequently attempted to write to a file that the ansible user could write to, but the splunk user couldn't.

I fixed the problem by changing the following line in my Dockerfile:

USER ${SPLUNK_USER}

to:

USER ${ANSIBLE_USER}

I might have been able to solve the problem another way: removing all USER instructions from my Dockerfile and inserting sudo in front of the apt-get command in the "custom stuff". However, after reading, but not fully understanding, some web articles that recommended against using sudo in a Dockerfile (I'm neither a Unix expert nor a Docker expert), I decided against that approach.
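For reference, here is a minimal sketch of the overall shape of the fixed Dockerfile. The custom steps are placeholders, and I'm assuming ANSIBLE_USER is defined by the splunk/splunk:7.3.0 base image, as described above:

```dockerfile
FROM splunk/splunk:7.3.0

USER root

# Custom root-level steps go here, e.g. apt-get install ...

# Switch to the ansible user rather than the splunk user, so that the
# base image's startup script can write to
# /opt/container_artifact/splunk-container.state.
USER ${ANSIBLE_USER}

# Further custom steps that don't need root go here...
```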

