Hi All,
For a Splunk indexer cluster with index replication: how many indexes can Splunk handle? We have 300 indexes. Is there any limit on the number of indexes that can be replicated?
Thanks! :)


There's no hard limit, really. It largely depends on the total amount of data you are indexing, your disk performance, and the efficiency of your indexing strategy, including autoLB and your replication and search factors. I've encountered customers indexing 1 TB+ per day with over 1,000 indexes.
If you are actively writing data to every single index, Splunk may have a hard time keeping up with all of the sourcetype parsing. CPU time is another possible bottleneck, so monitoring CPU usage on the indexers is also recommended, as is offloading parsing to an aggregation tier.
Here is a good starting point: http://docs.splunk.com/Documentation/Splunk/6.3.2/Capacity/DimensionsofaSplunkEnterprisedeployment
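To keep an eye on indexer CPU, one option is a search over Splunk's own _introspection index. A rough sketch (assumes the resource-usage introspection data is being collected; the `host=idx*` filter is a placeholder for your indexer naming convention):

```
index=_introspection sourcetype=splunk_resource_usage component=PerProcess
    data.process=splunkd host=idx*
| timechart avg(data.pct_cpu) AS avg_cpu_pct BY host
```

A sustained high average here, alongside indexing-queue fill in the Monitoring Console, is a sign the indexers are struggling to keep up with parsing.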
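For reference, the replication and search factors mentioned above live in server.conf on the cluster master, while each index opts into replication via repFactor in the indexes.conf distributed through the configuration bundle. A minimal sketch (values are illustrative, not recommendations; the index name is made up):

```
# server.conf on the cluster master
[clustering]
mode = master
replication_factor = 3
search_factor = 2

# indexes.conf in $SPLUNK_HOME/etc/master-apps/_cluster/local/
# (one stanza per index; with 300 indexes this file carries 300 stanzas)
[example_index]
homePath   = $SPLUNK_DB/example_index/db
coldPath   = $SPLUNK_DB/example_index/colddb
thawedPath = $SPLUNK_DB/example_index/thaweddb
repFactor  = auto
```

Setting `repFactor = auto` makes the index participate in cluster replication; leaving it at 0 keeps the index local to each peer.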
Thanks for your answer, it helps a lot.
