Deployment Architecture

Migrate an existing indexer cluster to SmartStore

sx
Engager

Hi everyone,

Long story short:

I am planning to migrate our Splunk cluster from the public cloud to on-prem. All the old data will remain in the cloud, but we will move it from local storage to SmartStore. New data will stream to the on-prem cluster with all configuration (index names, users, apps, reports, alerts, dashboards, etc.) unchanged, and we will keep a minimal in-cloud cluster up and running until the data ages out. That is why we want to move the data from local storage to SmartStore: cost savings.
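For the cloud-side part, my understanding is that enabling SmartStore comes down to defining a remote volume and pointing the indexes at it, something like the sketch below (the volume name, S3 bucket, endpoint, and index name are placeholders, not our real values):

```
# indexes.conf -- sketch only; bucket, endpoint, and names are placeholders
[volume:smartstore]
storageType = remote
path = s3://my-smartstore-bucket/indexes
remote.s3.access_key = <access key>
remote.s3.secret_key = <secret key>
remote.s3.endpoint = https://s3.us-east-1.amazonaws.com

[my_index]
homePath   = $SPLUNK_DB/my_index/db
coldPath   = $SPLUNK_DB/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thaweddb
# $_index_name expands to the stanza name, so buckets land under .../indexes/my_index
remotePath = volume:smartstore/$_index_name
```

Please correct me if that per-index `remotePath` approach is not the right way to migrate existing warm buckets.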

Now, I have two requirements:

1. Rename the indexes when the data is migrated to SmartStore. We may need to "hook up" the migrated data with our new on-prem cluster, so the index names need to be different from their previous names.
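As far as I know there is no direct "rename index" operation in Splunk, so I was wondering whether defining a new index stanza whose `remotePath` points at the old index's remote location would achieve the same thing (index names here are hypothetical):

```
# indexes.conf on the new on-prem cluster -- sketch; names are hypothetical
[new_index_name]
homePath   = $SPLUNK_DB/new_index_name/db
coldPath   = $SPLUNK_DB/new_index_name/colddb
thawedPath = $SPLUNK_DB/new_index_name/thaweddb
# point at the buckets already uploaded under the OLD index name,
# instead of the default volume:smartstore/$_index_name
remotePath = volume:smartstore/old_index_name
```

Is that a supported pattern, or does SmartStore require the remote path to match the index name?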

2. A few of our indexes were configured with "maxDataSize = auto_high_volume". From the SmartStore documentation, it seems that only "maxDataSize = auto" is supported. Even if we reconfigure this to "auto", it won't resize the existing buckets from 10 GB down to 750 MB. My question is: is there any way for us to just move these buckets into SmartStore as-is? The purpose is only to retain the data until it expires; there won't be active searches on it.
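Concretely, the only change I would make is the one below, on the assumption (please confirm) that the existing 10 GB buckets are left untouched and only newly created buckets roll at the smaller auto size:

```
# indexes.conf -- change before enabling SmartStore on this index
[high_volume_index]
# was: maxDataSize = auto_high_volume   (new buckets roll at ~10 GB)
maxDataSize = auto                      # new buckets roll at ~750 MB
remotePath  = volume:smartstore/$_index_name
```

Will the upload of the old oversized buckets to the remote store still work, given they only need to sit there until retention expires?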

Thank you
