Installation

How to auto-create an index in a Docker container?

dstromberg
Path Finder

 

Hi folks.

I'm attempting to run Splunk in a Docker container.  Or rather, I have that working - it was pretty easy with docker-compose, based on https://splunk.github.io/docker-splunk/EXAMPLES.html#create-standalone-from-compose

However, I want to create an index automatically when the container first starts up.  This I'm finding difficult.

I've tried a variety of methods, but they all failed in one way or another:

  • yum and dnf are missing from the container, and microdnf appears to be broken.  This makes it difficult to customize the container's configuration.
  • The container so configured appears to be based on RHEL, and we don't have any RHEL entitlements.  This too makes it difficult to customize the container's behavior.
  • I tried setting up a volume and adding a script that would start splunk and shortly thereafter add the index, but I found that Splunk was missing lots of config files this way.  This may or may not be due to my relative inexperience with docker.
    • I invoked the script with the following in docker-compose.yml:
      • entrypoint: /bin/bash
      • command: /spunk-files/start 
    •  I needed to copy these files, which I didn't have to copy before the entrypoint+command change:
      • $SPLUNK_HOME/etc/splunk-launch.conf
      • $SPLUNK_HOME/etc/splunk.version
    • I also needed to create some logging directories, otherwise Splunk would fail to start.
  • One of my favorite troubleshooting techniques, using a system call tracer like "strace", wasn't working because I couldn't install it - see above under microdnf.
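To make the volume-plus-script attempt concrete, the override from the third bullet looked roughly like this in docker-compose.yml (the image tag and service name are illustrative, not from my actual file; /spunk-files is the mount point from my setup):

```yaml
services:
  so1:
    image: splunk/splunk:latest   # tag is illustrative
    # Overriding the entrypoint bypasses the image's splunk-ansible
    # provisioning, which is presumably why config files went missing.
    entrypoint: /bin/bash
    command: /spunk-files/start
    volumes:
      - ./spunk-files:/spunk-files
```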

Does anyone know of a good way to auto-create a Splunk index at container creation time, without an RHEL entitlement?

Thanks!

 


schose
Builder

Hi,

The Docker container is started using splunk-ansible. You can configure some behaviors of your container using environment variables, and others using default.yml:

https://github.com/splunk/splunk-ansible/blob/develop/docs/advanced/default.yml.spec.md

Use the apps_location parameter to install apps automatically at container startup:

https://github.com/splunk/splunk-ansible/blob/37149df811538d589ab4740284d3a7264314d41f/docs/advanced...

You can have the apps downloaded, or present them on persistent storage.

I would create an app with an indexes.conf containing your index configuration, and configure the index there.
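As a sketch (the app name "indexapp" and index name "myindex" are placeholders, not anything Splunk ships):

```shell
# Build a minimal Splunk app whose only job is to define an index.
# "indexapp" and "myindex" are placeholder names.
mkdir -p indexapp/default
cat > indexapp/default/indexes.conf <<'EOF'
[myindex]
homePath   = $SPLUNK_DB/myindex/db
coldPath   = $SPLUNK_DB/myindex/colddb
thawedPath = $SPLUNK_DB/myindex/thaweddb
EOF
# Point apps_location at a tarball of this app, or mount the directory
# under $SPLUNK_HOME/etc/apps/, and the index is created at startup.
```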

regards,

Andreas


schose
Builder

BTW, here is a docker-compose example using direct filesystem mapping:

version: '3'
services:
  single:
    image: splunk/splunk:8.1.5
    ports:
      - "8111:8000"
    volumes:
      - single-etc:/opt/splunk/etc
      - single-var:/opt/splunk/var
      - /my/path/to/indexapp/indexapp:/opt/splunk/etc/apps/indexapp
    hostname: idx1
    environment:
      - SPLUNK_HOME=/opt/splunk/
      # - DEFAULTS_URL=http://splunk-defaults/default.yml
      - SPLUNK_START_ARGS=--accept-license
      - SPLUNK_PASSWORD=EnterYourCreditCardNumber
      - SPLUNK_ROLE=splunk_standalone
      - SPLUNK_DEBUG=true

volumes:
  single-etc:
  single-var:

networks:
  default:
    external:
      name: splunk

 

regards,

Andreas


smurf
Path Finder

Hi,

I haven't tried something like this yet, but I think creating persistent storage with the index config could help.

Data Storage | docker-splunk
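Something like this in docker-compose.yml (volume names and paths are illustrative):

```yaml
services:
  splunk:
    volumes:
      # Persisting etc/ and var/ keeps index definitions and
      # indexed data across container re-creation.
      - splunk-etc:/opt/splunk/etc
      - splunk-var:/opt/splunk/var

volumes:
  splunk-etc:
  splunk-var:
```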
