Monitoring Splunk

Does anyone know the recommended base amount of disk space for Splunk Enterprise, exclusive of indexed data?

vanderaj2
Path Finder

Hello,

My server operations team is standing up a set of Splunk servers for me to set up a distributed Splunk deployment. FYI - we are keeping a separate path for storage of indexed data. They want to know how much space should be provisioned just to support the base Splunk software, exclusive of the indexed data.

I did some research into Splunk reference hardware. For search heads, I see that the recommendation is:

2 x 300GB, 10,000 RPM SAS hard disks, configured in RAID 1

Is 300 GB worth of space the rule of thumb I should be using for $SPLUNK_HOME?

Thanks!

1 Solution

somesoni2
Revered Legend

The 300 GB recommendation is for search heads with a large number of users/user searches (needed mostly for the dispatch directory and configs). Other components (intermediate forwarders, deployment servers, SHC deployers) don't have space requirements anywhere near that, and 100 GB would be more than sufficient. For indexers as well, excluding $SPLUNK_HOME/var/lib (index storage), 100 GB would be more than sufficient.


cstump_splunk
Splunk Employee

For indexed data, you will want to calculate your anticipated capacity. Start by determining how much you will index each day, then take your retention settings into account.

For instance, let's say you will be ingesting 1 GB of data each day and you want a retention period of 2 years. That means that in two years you will be storing roughly 730 GB of data.
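As a quick illustration of that arithmetic, here is a minimal sketch; the daily ingest and retention values are just the example figures from above and are placeholders you would swap for your own numbers:

    # Rough capacity estimate before compression: daily ingest x retention days.
    # The values below are the hypothetical example figures from the post above.
    daily_ingest_gb = 1           # GB indexed per day
    retention_days = 2 * 365      # two-year retention

    raw_storage_gb = daily_ingest_gb * retention_days
    print(f"~{raw_storage_gb} GB of data over the retention period")  # ~730 GB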

I would get some estimates on what you expect/need and go from there.

adonio
Ultra Champion

Splunk has good compression capability, averaging around 50%.
Things also take on another angle with ES or ITSI, or any major use of data model (DM) acceleration.
In general, these are the basic formulas Splunk uses, from the docs:
http://docs.splunk.com/Documentation/Splunk/7.1.2/Capacity/HowSplunkcalculatesdiskstorage
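To make the ~50% rule of thumb concrete, here is a minimal sketch assuming the commonly cited ratios of roughly 15% of incoming data for compressed rawdata and roughly 35% for the index files; these ratios vary by data type, so check the linked doc against your own data:

    # Hedged sketch of the rule of thumb behind that doc: compressed rawdata is
    # typically around 15% of incoming data and the index (tsidx) files add roughly
    # another 35%, for about 50% of raw volume overall. Ratios vary by data type.
    daily_ingest_gb = 1
    retention_days = 2 * 365

    rawdata_gb = daily_ingest_gb * retention_days * 0.15      # compressed raw events
    index_files_gb = daily_ingest_gb * retention_days * 0.35  # tsidx and related files
    print(f"~{rawdata_gb + index_files_gb:.0f} GB on disk")   # ~365 GB for this example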

vanderaj2
Path Finder

Thank you! That is very helpful....


adonio
Ultra Champion

@vanderaj2 if it answers your question, kindly accept the answer so others can see it worked for you.
