Deployment Architecture

Restoring only selected indexes, backup best practices

Branden
Builder

Hello. We are planning an upgrade from Splunk 7.2.1 to Splunk 8.0.x. As a best practice, we'd like to back up our indexes prior to the upgrade, but we're having an issue: our backup solution can never keep up with the amount of incoming data, and we can't (won't) shut Splunk down long enough to freeze incoming traffic for a full backup. It's many terabytes of data...

Some indexes are more important than others, so I've been considering backing up just the 'critical' indexes. If the indexes become corrupted during the upgrade, we could still restore the important ones.

Is that sound logic? Can indexes be restored piecemeal like that?

Our environment is simple: one Indexer/Search head with a bunch of UFs.

Also, if anyone has recommendations for backing up Splunk indexes, please let me know. I've read Splunk's docs as well as posts here about how to do it, but as I mentioned, the backups just can't keep up with the incoming stream.
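For what it's worth, Splunk's own backup guidance is to copy warm and cold buckets, since those are immutable once rolled, and to skip (or first roll) hot buckets, which are actively being written. A per-index copy along those lines might look like the sketch below. This is only a sketch: the index names (`main`, `critical_idx`), the backup target path, and the `$SPLUNK_DB` default are placeholders you'd replace for your environment.

```shell
#!/bin/sh
# Sketch: back up warm/cold buckets of selected indexes only.
# Hot buckets (hot_*) are in active use and are skipped here; roll them
# to warm first if you need the very latest data in the backup.

SPLUNK_DB=${SPLUNK_DB:-/opt/splunk/var/lib/splunk}  # default layout; adjust
BACKUP_DIR=${BACKUP_DIR:-/tmp/splunk_backup}        # placeholder target

mkdir -p "$BACKUP_DIR"

for index in main critical_idx; do       # hypothetical 'critical' indexes
  for sub in db colddb; do               # warm (db) and cold (colddb) dirs
    src="$SPLUNK_DB/$index/$sub"
    [ -d "$src" ] || continue
    mkdir -p "$BACKUP_DIR/$index/$sub"
    for bucket in "$src"/*; do
      [ -e "$bucket" ] || continue       # empty dir: glob didn't expand
      case $(basename "$bucket") in
        hot_*) ;;                        # skip live hot buckets
        *) cp -R "$bucket" "$BACKUP_DIR/$index/$sub/" ;;
      esac
    done
  done
done
```

Because rolled buckets never change, only new buckets need copying on each pass, so an incremental tool like rsync in place of `cp -R` would avoid re-copying terabytes and may let the backup keep up with the incoming stream.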

Thanks!


richgalloway
SplunkTrust

Are your indexers clustered?

---
If this reply helps you, Karma would be appreciated.

Branden
Builder

They are not. Simple environment. One indexer, bunch of forwarders. I'll update the original post with that info.
