Migrating data to SmartStore


We're retiring our internally hosted Splunk environment and moving the data into an EC2 instance on AWS. It seems like our best solution is to use SmartStore and I'm trying to determine the best way to migrate our data.

  1. We're moving multiple TB of logs.
  2. Once that data is in S3, we won't be adding any new logs to Splunk.
  3. We would like the old data to remain searchable.
  4. We will be reducing our indexer count from 7 down to 2, as this environment will be minimally accessed.

I believe the best solution is to enable SmartStore on our servers and once the data is transferred to S3, create the new indexers and decommission our old environment. Am I missing something with this plan?
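For context, enabling SmartStore on the existing indexers would mean adding a remote volume and pointing the indexes at it in indexes.conf. A minimal sketch is below; the bucket name, volume name, and region endpoint are placeholders, not actual values from this environment:

```
# indexes.conf — SmartStore sketch (hypothetical bucket/volume names)
[volume:remote_store]
storageType = remote
path = s3://my-splunk-smartstore-bucket/indexes
# Credentials can be omitted if the EC2 instance uses an IAM role
remote.s3.endpoint = https://s3.us-east-1.amazonaws.com

[default]
# Route all indexes to the remote volume; $_index_name expands per index
remotePath = volume:remote_store/$_index_name
```

Once the warm buckets have uploaded to S3, the new (smaller) indexer set can be configured with the same remote volume and take over searching that data.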
