Deployment Architecture

Is it good practice to run an rsync script to take a backup of any new warm buckets created to a new partition?

kkossery
Communicator

I need to start backing up my Splunk indexes and was looking at backing up any new warm buckets. I'm planning to do this by running an rsync script that copies each newly created warm bucket to a separate partition (rough sketch at the end of this post).
Is this a good practice?
I'm also interested in knowing what other users are doing to back up their Splunk indexes on Amazon EC2.
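Roughly what I have in mind is something like the sketch below (Python calling rsync; the paths, index name, and the db_* warm-bucket naming are my assumptions based on a default install, not a finished script):

#!/usr/bin/env python
# Sketch: copy newly created warm buckets to a backup partition with rsync.
# Assumptions: default Splunk paths, index 'main'; warm buckets are the db_*
# directories under the index's db/ path (hot_* buckets are still being
# written and are skipped).
import os
import subprocess

INDEX_DB = "/opt/splunk/var/lib/splunk/main/db"   # hot/warm buckets of the 'main' index
BACKUP_DIR = "/backup/splunk/main/db"             # separate backup partition

os.makedirs(BACKUP_DIR, exist_ok=True)

for name in os.listdir(INDEX_DB):
    src = os.path.join(INDEX_DB, name)
    # only warm buckets (db_*); skip hot_* buckets, which are still open
    if not (os.path.isdir(src) and name.startswith("db_")):
        continue
    dst = os.path.join(BACKUP_DIR, name)
    if os.path.exists(dst):
        continue  # this bucket was already backed up
    # -a preserves attributes; trailing slashes copy the directory contents
    subprocess.check_call(["rsync", "-a", src + "/", dst + "/"])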

Thanks


kkossery
Communicator

I've installed s3sync on the Splunk box, which syncs the buckets (hot/warm/cold) to S3 storage.
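For anyone looking at doing something similar without s3sync, a comparable sketch using boto3 would look roughly like this (the S3 bucket name, key prefix, and local path here are just placeholder assumptions, and it only walks the index's db/ directory):

# Sketch: upload each file of every bucket under the index's db/ directory
# to S3, keyed by its path relative to the db/ directory.
import os
import boto3

INDEX_DB = "/opt/splunk/var/lib/splunk/main/db"
S3_BUCKET = "my-splunk-backups"        # hypothetical S3 bucket name
S3_PREFIX = "indexes/main/db"

s3 = boto3.client("s3")

for root, dirs, files in os.walk(INDEX_DB):
    for fname in files:
        path = os.path.join(root, fname)
        key = S3_PREFIX + "/" + os.path.relpath(path, INDEX_DB)
        s3.upload_file(path, S3_BUCKET, key)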


Arkon
Explorer

Watch out for your S3 lifecycle policy in case it automatically removes files after some time.
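A quick way to check would be something like this boto3 sketch (the bucket name is a placeholder):

# Sketch: print the lifecycle rules on the backup bucket so expirations
# that would delete backed-up files are visible.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
try:
    cfg = s3.get_bucket_lifecycle_configuration(Bucket="my-splunk-backups")
    for rule in cfg.get("Rules", []):
        print(rule.get("ID"), rule.get("Status"), rule.get("Expiration"))
except ClientError:
    print("No lifecycle configuration set on this bucket.")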


Yasaswy
Contributor

Hi kkossery, in general I would consider this "not" a good practice, mostly because it does not scale well and is very config- and environment-dependent. I would go with index clustering to solve any availability requirements.


kkossery
Communicator

Thanks! I'll wait to see what others have to say on this.
