Monitoring Splunk

Does anyone know the base recommended amount of disk space for Splunk Enterprise exclusive of indexed data?

vanderaj2
Path Finder

Hello,

My server operations team is standing up a set of Splunk servers for me to set up a distributed Splunk deployment. FYI - we are keeping a separate path for storage of indexed data. They want to know how much space should be provisioned just to support the base Splunk software, exclusive of the indexed data.

I did some research into Splunk reference hardware. For search heads, I see that the recommendation is:

2 x 300GB, 10,000 RPM SAS hard disks, configured in RAID 1

Is 300 GB worth of space the rule of thumb I should be using for $SPLUNK_HOME?

Thanks!

1 Solution

somesoni2
Revered Legend

The 300 GB is recommended for Search Heads with a large number of users/user searches (needed mostly for the dispatch directory and configs). Other components (Intermediate Forwarders/Deployment Servers/SHC Deployers) do not have such large space requirements, and 100 GB would probably be more than sufficient. For indexers as well, excluding $SPLUNK_HOME/var/lib (index storage), 100 GB would be more than sufficient.


cstump_splunk
Splunk Employee

For indexed data, you will want to calculate your anticipated capacity. Start by determining how much you will index each day, then take your retention settings into account.

For instance, let's say that you will be ingesting 1 GB of data each day and that you want a retention period of 2 years. This means that in two years you will be storing 730 GB of data.
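The arithmetic above can be sketched as follows (the variable names are illustrative, not from any Splunk tooling):

```python
# Raw capacity estimate using the example figures above
daily_ingest_gb = 1        # GB ingested per day
retention_days = 2 * 365   # two-year retention period

raw_storage_gb = daily_ingest_gb * retention_days
print(raw_storage_gb)  # 730
```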

I would get some estimates on what you expect/ need and go from there.

adonio
Ultra Champion

Splunk has good compression capability, on average around 50%.
Things also take on another angle with ES or ITSI, or any major use of Data Model Acceleration.
In general, these are the basic formulas Splunk uses, from the docs:
http://docs.splunk.com/Documentation/Splunk/7.1.2/Capacity/HowSplunkcalculatesdiskstorage
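A minimal sketch combining the retention math above with that rule-of-thumb compression ratio (the ~50% default and the helper name are my own illustrative assumptions; check the linked docs for the exact formula):

```python
def estimated_disk_gb(daily_ingest_gb: float, retention_days: int,
                      compression_ratio: float = 0.5) -> float:
    """Approximate on-disk storage for one copy of the indexed data.

    compression_ratio is a rule-of-thumb: raw data typically shrinks
    to roughly half its original size once indexed (rawdata journal
    plus index files combined).
    """
    return daily_ingest_gb * retention_days * compression_ratio

# 1 GB/day for 2 years at ~50% compression -> ~365 GB on disk
print(estimated_disk_gb(1, 730))  # 365.0
```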

vanderaj2
Path Finder

Thank you! That is very helpful....


adonio
Ultra Champion

@vanderaj2 if it answers your question, kindly accept the answer so others can see it worked for you
