Deployment Architecture

Is it possible to limit bucket replication with limits.conf?

mudragada
Path Finder

We recently set up a multisite cluster with replication between the sites.
This is causing network congestion when the buckets replicate. Is there a way to limit this with something like limits.conf?


sowings
Splunk Employee

No. Splunk's data streaming and "fixup" activity after a failure are designed to return the cluster to a healthy state as quickly as possible. For "live" data streaming (hot buckets), the source indexer sends a copy of each data "slice" (~128 KB by default) to as many peers as the replication factor requires. After downtime or another event that forces the cluster to fix itself up, you can throttle the number of jobs (that is, the number of simultaneous attempts to copy data), but not the bandwidth consumed.
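As a sketch of the job-count throttling mentioned above: these settings live in the `[clustering]` stanza of server.conf on the cluster manager, and the names and defaults shown here should be verified against your version's server.conf.spec.

```
# server.conf on the cluster manager -- a sketch of fixup-job throttling.
# These limit how many jobs run at once; they do NOT cap bandwidth.
[clustering]
mode = master
# Max concurrent replications a single peer may take part in as a target
# during fixup activity (default is 5 in recent versions).
max_peer_rep_load = 5
# Max concurrent bucket-build jobs (making copies searchable) per peer.
max_peer_build_load = 2
```

Lowering these slows fixup (the cluster stays in a degraded state longer) in exchange for fewer simultaneous transfers; it does not shape the per-transfer throughput of hot-bucket streaming.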

Unfortunately, if you're counting bytes on the WAN, you may not be ready for multi-site clustering.
