Deployment Architecture

What is the calculation used to determine the Archived (Frozen) storage requirement to be 202.5 GB?

muradgh
Path Finder

Hi Splunkers,

In the "Architecting Splunk 8.0.1 Enterprise Deployments" coursework, we were given a data sizing sheet to calculate everything, but this sheet does not cover the frozen storage requirements.

I tested one of the examples we had on the "Splunk sizing" website:
https://splunk-sizing.appspot.com/
It matches what was in the data sizing sheet, but I need to add the "frozen" tier to the table.

To validate the calculations, I assumed the following:
Daily Data Volume = 5GB
Raw Compression Factor = 0.15
Metadata Size Factor = 0.35
Number of Days in Hot/Warm = 30 days
Number of Days in Cold = 60 days
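For context, my understanding is that the searchable tiers in the sizing sheet combine both factors, since hot/warm and cold buckets store compressed raw data plus index metadata. A sketch of that assumption in Python:

```python
# Assumed inputs from the sizing sheet example
daily_gb = 5
raw_compression = 0.15
metadata_factor = 0.35

# Hot/warm and cold buckets hold compressed raw data plus metadata,
# so both factors apply (an assumption, matching the sizing website)
hot_warm_gb = daily_gb * (raw_compression + metadata_factor) * 30
cold_gb = daily_gb * (raw_compression + metadata_factor) * 60
print(hot_warm_gb, cold_gb)  # → 75.0 150.0
```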


Then, for testing, I increased the Archived (Frozen) slider from 0 days to 9 months and found that the "Archived" storage requirement is now 202.5 GB.
My question is: what calculation is used here to determine the "Archived" storage requirement of 202.5 GB?


Thank you in advance.



1 Solution

scelikok
SplunkTrust

Hi @muradgh,

The frozen path keeps only compressed raw data, which is why the calculation uses the formula below:

DailyDataVolume = 5GB
RawCompressionFactor = 0.15
NumberofDaysinFrozen = 9 months = 270 days

ArchivedStorageSpace = DailyDataVolume * RawCompressionFactor * NumberofDaysinFrozen

ArchivedStorageSpace = 5 * 0.15 * 270 = 202.5 GB
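The arithmetic above can be sketched in Python (the 0.15 compression factor and the 30-day month are values from this thread, not Splunk constants):

```python
def archived_storage_gb(daily_volume_gb, raw_compression_factor, days_frozen):
    """Frozen storage estimate: only compressed raw data is kept,
    so the metadata size factor does not apply."""
    return daily_volume_gb * raw_compression_factor * days_frozen

# Values from the example: 5 GB/day, 0.15 compression, 9 months ≈ 270 days
print(archived_storage_gb(5, 0.15, 9 * 30))  # → 202.5
```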

Please keep in mind that Splunk does not manage the archive path; it just moves raw data there. There is no retention check, so this calculation only helps you provision enough space. You should manually clean data older than 9 months.
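Since Splunk does not expire frozen data itself, the manual cleanup could be sketched as below. The frozen path and the 270-day retention are assumptions for illustration; run with `dry_run=True` first and verify before deleting anything:

```python
import shutil
import time
from pathlib import Path

MAX_AGE_SECONDS = 270 * 24 * 3600  # ~9 months of retention (assumed)
FROZEN_PATH = Path("/opt/splunk/frozen")  # hypothetical archive location

def clean_frozen(frozen_path=FROZEN_PATH, max_age=MAX_AGE_SECONDS, dry_run=True):
    """Remove frozen bucket directories whose mtime is older than max_age seconds."""
    cutoff = time.time() - max_age
    for bucket in frozen_path.iterdir():
        if bucket.is_dir() and bucket.stat().st_mtime < cutoff:
            if dry_run:
                print(f"would remove {bucket}")
            else:
                shutil.rmtree(bucket)
```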

 

If this reply helps you, an upvote and "Accept as Solution" are appreciated.



muradgh
Path Finder

Hi @scelikok 

Thank you very much.

0 Karma