Deployment Architecture

500GB day license - HW needed and best practice "N" of Search Heads/indexers

jbanAtSplunk
Communicator

Hi, if we upgrade the license to 500GB/day:

What is the best-practice hardware architecture (CPU + RAM), and what number "N" of Search Heads and "N" of indexers?

How much storage per indexer do we need if, say, retention is 30 days and "N" indexers are installed?

Or, at least, can you share a good PDF for me to read with those answers?

Thank you.


gcusello
SplunkTrust

Hi @jbanAtSplunk,

this isn't a question for the Community but for a Splunk Architect.

Anyway, there are many other parameters needed to answer your question:

  • is there an Indexer Cluster, and if so, what are the Search Factor and the Replication Factor?
  • is there a Search Head Cluster?
  • are there Premium Apps such as Enterprise Security or ITSI?
  • how many concurrent users do you foresee on the system?
  • are there scheduled searches?

If you don't have ES or ITSI, you could use around 3 Indexers.

If you don't have a Search Head Cluster, you can use one Search Head; if you have a Search Head Cluster, you need at least three SHs and a Deployer.

If you have an Indexer Cluster, you need at least 3 Indexers and one Cluster Manager.

If you have ES or ITSI the resources are completely different!

For storage: if you don't have an Indexer Cluster you could consider:

Storage = License*Retention*0.5 = 500*30*0.5 = 7500 GB

If you have an indexer Cluster the required storage depends on the above factors.
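As a quick check, the non-clustered estimate above can be computed directly. This is a minimal sketch; the variable names are mine, and the 0.5 factor is the compression assumption stated in the formula:

```python
# Non-clustered storage estimate, per the formula above.
daily_license_gb = 500   # 500 GB/day license
retention_days = 30
compression = 0.5        # assumed ~50% on-disk compression

storage_gb = daily_license_gb * retention_days * compression
print(storage_gb)  # 7500.0
```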

About CPU and RAM: they depend on:

  • the presence of Premium Apps,
  • the number of concurrent users,
  • the number of scheduled searches,

so I cannot help you without this information; the only hint is to check the reference hardware at this URL: https://docs.splunk.com/Documentation/Splunk/9.1.1/Capacity/Referencehardware

Ciao.

Giuseppe


jbanAtSplunk
Communicator

Will check the reference.

We already have 1 x SH, 1 x ESS, 2 x indexers in a cluster, and 1 x Deployment Server.

But the license was 4 times smaller; now that we will expand the license, I am looking at what we need to expand (storage, CPU, RAM) and by how much.

We will probably go to 4 indexers (from 2) and expand from 2.5 TB per indexer to 7.5 TB per indexer.


gcusello
SplunkTrust

Hi @jbanAtSplunk ,

storage on an Indexer Cluster depends on the Replication Factor and the Search Factor: what are they?

What's ESS?

Do you have Premium Apps?

Ciao.

Giuseppe


jbanAtSplunk
Communicator

Currently

[default]
repFactor = auto

The Search Factor is the default, so it's 2.

ESS is Splunk Enterprise Security (on its own SH); no other Premium Apps.


gcusello
SplunkTrust

Hi @jbanAtSplunk,

this means that you require more Indexers: at least 5.

About storage: if RF and SF are both 2, you have 5 Indexers, and you allow a Contingency of 10%, you'll have:

Total_Storage = (License*Retention*0.5*SF)*(1 + Contingency) + License*3.4 = (500*30*0.5*2)*1.1 + 1700 = 18200 GB

Storage per Indexer = 18200/5 = 3640 GB per Indexer

(License*3.4 is the datamodel storage for ES)
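The same formula as a runnable sketch (variable names are mine; the 3.4x ES datamodel factor and the 10% contingency are the assumptions stated above):

```python
# Cluster storage estimate for SF=2, 5 indexers, per the formula above.
license_gb = 500        # daily license volume
retention_days = 30
compression = 0.5       # assumed ~50% on-disk compression
search_factor = 2       # SF: number of searchable copies
contingency = 0.10      # 10% headroom
es_datamodel_gb = license_gb * 3.4  # rule of thumb for ES datamodel storage
indexers = 5

total_gb = (license_gb * retention_days * compression * search_factor) \
           * (1 + contingency) + es_datamodel_gb
per_indexer_gb = total_gb / indexers
print(round(total_gb), round(per_indexer_gb))  # 18200 3640
```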

Ciao.

Giuseppe

richgalloway
SplunkTrust

The recommended hardware does not change as ingestion changes.  Scale by adding instances rather than by adding resources.

The number of search heads is a function of the number of searches to run and the number of users to support.  The number of indexers is related to the rate of ingestion, but also must consider the number of searches to run (remember that indexers save data and search it).

Storage need is not just the retention period times the amount ingested each day.  Consider also replication of data among indexers, datamodel accelerations (which can consume a lot of space), and data compression.
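Those factors can be folded into a rough, parameterized estimate. `estimate_storage_gb` is a hypothetical helper of mine, and every default value below is an illustrative assumption, not Splunk guidance:

```python
def estimate_storage_gb(daily_ingest_gb, retention_days,
                        compression=0.5,    # assumed on-disk compression ratio
                        copies=2,           # replicated/searchable copies kept
                        dma_gb=0.0,         # datamodel acceleration storage
                        contingency=0.10):  # headroom for growth
    """Rough total storage across all indexers, in GB (illustrative sketch)."""
    base = daily_ingest_gb * retention_days * compression * copies
    return base * (1 + contingency) + dma_gb

# 500 GB/day, 30-day retention, 2 copies, ES datamodels ~3.4x daily ingest:
print(round(estimate_storage_gb(500, 30, dma_gb=500 * 3.4)))  # 18200
```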

There is an app that can help.  See https://splunkbase.splunk.com/app/5176 and engage your Splunk account team as they are experts at this.

---
If this reply helps you, Karma would be appreciated.