Deployment Architecture

What is the calculation used to determine Archived (Frozen) storage requirements to be 202.5GB?

muradgh
Path Finder

Hi Splunkers,

In the "Architecting Splunk 8.0.1 Enterprise Deployments" coursework, we have been given a data sizing sheet to calculate everything in the coursework, but this sheet does not cover the frozen requirements.

I tested one of the examples we had against the "Splunk sizing" website:
https://splunk-sizing.appspot.com/
It matches what was in the data sizing sheet, but I need to add "frozen" storage to the table.

To validate the calculations, I assumed the following example values (a quick sketch of the arithmetic follows the list):
Daily Data Volume = 5GB
Raw Compression Factor = 0.15
Metadata Size Factor = 0.35
Number of Days in Hot/Warm = 30 days
Number of Days in Cold = 60 days
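
For reference, here is a rough sketch of how the hot/warm and cold figures can be reproduced from these values, assuming the sheet simply multiplies the daily volume by the sum of the two factors and the retention days (variable names are my own, not from the sheet):

# Rough sizing sketch (hypothetical variable names), assuming
# stored size per day = daily volume * (raw compression factor + metadata size factor)
daily_volume_gb = 5
raw_compression_factor = 0.15
metadata_size_factor = 0.35
days_hot_warm = 30
days_cold = 60

stored_per_day_gb = daily_volume_gb * (raw_compression_factor + metadata_size_factor)
hot_warm_gb = stored_per_day_gb * days_hot_warm   # 5 * 0.5 * 30 = 75 GB
cold_gb = stored_per_day_gb * days_cold           # 5 * 0.5 * 60 = 150 GB
print(f"Hot/Warm: {hot_warm_gb} GB, Cold: {cold_gb} GB")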


Then, for testing, I increased the Archived (Frozen) slider from 0 days to 9 months and found that the "Archived" storage requirement is now 202.5 GB.
My question is: what calculation is used here to arrive at 202.5 GB for the "Archived" storage requirement in this case?


Thank you in advance.



1 Solution

scelikok
SplunkTrust

Hi @muradgh,

The frozen path keeps only compressed raw data, which is why the calculation uses the formula below:

DailyDataVolume = 5GB
RawCompressionFactor = 0.15
NumberofDaysinFrozen = 9 months = 270 days

ArchivedStorageSpace = DailyDataVolume * RawCompressionFactor * NumberofDaysinFrozen

ArchivedStorageSpace = 5 * 0.15 * 270 = 202.5 GB
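
The same arithmetic as a short Python snippet, in case you want to add it to your sizing sheet (variable names are illustrative, not from the sheet or the calculator):

# Archived (frozen) storage estimate: compressed raw data only.
daily_data_volume_gb = 5
raw_compression_factor = 0.15   # frozen buckets keep only the compressed rawdata
days_in_frozen = 9 * 30         # 9 months, counted as 270 days by the calculator

archived_storage_gb = daily_data_volume_gb * raw_compression_factor * days_in_frozen
print(f"Archived storage: {archived_storage_gb} GB")  # 202.5 GB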

Please keep in mind that Splunk does not manage the archive path; it just moves the raw data there, and there is no retention check. This calculation is only so you can provision enough space. You should manually clean up data older than 9 months.

 

If this reply helps you, an upvote and "Accept as Solution" is appreciated.


muradgh
Path Finder

Hi @scelikok 

Thank you very much.
