Splunk Enterprise Security

Indexer Volumes (Disks) in a SmartStore Configuration


Hi All,

Per Splunk's "Deploying Splunk Enterprise on Amazon Web Services" tech brief, the recommended instance type for indexers leveraging SmartStore remote storage is i3.8xlarge, which comes with ephemeral (instance-store) storage. As I understand it, this ephemeral storage holds the SmartStore cache. What I'm trying to understand is how the good people here have set up their indexers to leverage SmartStore (S3) while also using an ephemeral disk (if at all) for the local cache, since non-cache data (e.g., the config files under /opt/splunk) would be lost when the instance is stopped or replaced.
- Are folks attaching an EBS volume for the indexer configuration? I feel an attached EBS volume would somewhat undercut the cost savings of going the SmartStore route.
- Are they leveraging automation to rebuild the server each time it is restarted/rebuilt?
- What does your indexer setup look like while using SmartStore (i.e., AWS instance type, storage volume(s), remote storage type)?

That’s the hole in my understanding at the moment. Any clarification is highly appreciated.
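For reference, the kind of SmartStore setup I'm describing is roughly the following indexes.conf sketch (the bucket name and endpoint are placeholders, not a working config):

```
# indexes.conf -- minimal SmartStore sketch; bucket/region are placeholders
[volume:remote_store]
storageType = remote
path = s3://my-smartstore-bucket
remote.s3.endpoint = https://s3.us-east-1.amazonaws.com

[default]
# Send warm buckets for all indexes to the remote volume;
# only the local cache stays on the indexer's disk.
remotePath = volume:remote_store/$_index_name
```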


Splunker Next Door.


You likely would not put /opt/splunk on the ephemeral volumes of the instance. You can have multiple EBS volumes in addition to the ephemeral volumes that come with the i3.8xlarge instance. Put /opt/splunk on an EBS volume, and only the SmartStore cache on an ephemeral volume. What I am currently struggling with is a good script to mount the ephemeral volume on RHEL when the service and/or the host starts. Adding a script to the ExecStartPre command of the systemd service works most of the time, but it is not smooth enough. Any good script out there would be great to see.
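A rough sketch of the ExecStartPre approach, in case it helps others. The device name, mount point, script path, and the `Splunkd.service` unit name are all assumptions for a typical i3/RHEL setup; adjust to your environment:

```
# /usr/local/sbin/mount-splunk-cache.sh (assumed path)
#!/bin/bash
set -euo pipefail
DEV=/dev/nvme0n1        # assumed instance-store device on an i3 instance
MNT=/opt/splunk-cache   # assumed SmartStore cache mount point

# Instance-store data does not survive stop/start, so the disk may come
# up blank: create a filesystem only if none is present yet.
if ! blkid "$DEV" >/dev/null 2>&1; then
    mkfs.xfs -f "$DEV"
fi

mkdir -p "$MNT"
mountpoint -q "$MNT" || mount "$DEV" "$MNT"
chown splunk:splunk "$MNT"
```

Wired into the service via a drop-in, so the cache disk is ready before splunkd starts:

```
# /etc/systemd/system/Splunkd.service.d/cache.conf
[Service]
# "+" runs this step as root even if the service runs as the splunk user
ExecStartPre=+/usr/local/sbin/mount-splunk-cache.sh
```

An fstab entry will not work here because the filesystem may not exist yet after a stop/start, which is why the script checks with blkid first.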




This is quite a long story, and there are a couple of ways to do it.

Before anyone answers here, you could look for answers in https://splunk-usergroups.slack.com/archives/CD6JNQ03F. Unfortunately there is no single part of that channel you need to read; the relevant discussion is spread across its whole history.

Those i* nodes are good for SmartStore, but it depends on what kind of architecture you have and how you manage and automate it. I propose that you contact a specialist from Splunk and AWS who has done this before, as there are many places where you could fail (read that channel and you will see at least some of them). It's not enough that this person is an AWS guru or a Splunk guru; he/she must understand both environments to get this working.

r. Ismo
