Deployment Architecture

What is the calculation used to determine Archived (Frozen) storage requirements to be 202.5GB?

muradgh
Path Finder

Hi Splunkers,

In the "Architecting Splunk 8.0.1 Enterprise Deployments" coursework, we have been given a data sizing sheet to calculate everything in the coursework, but this sheet does not cover the frozen requirements.

I tested one of the course examples on the "Splunk sizing" website:
https://splunk-sizing.appspot.com/
and it matches the data sizing sheet, but I now need to add "frozen" to the table.

To validate the calculations, I assumed the following (see the sketch after this list):
Daily Data Volume = 5GB
Raw Compression Factor = 0.15
Metadata Size Factor = 0.35
Number of Days in Hot/Warm = 30 days
Number of Days in Cold = 60 days
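
For reference, this is how I understand the searchable (hot/warm/cold) math from the sizing sheet, as a minimal Python sketch; the assumption that both factors are applied per retained day is mine, not from the course:

# Searchable storage estimate: both the compressed rawdata and the metadata
# (index/tsidx files) stay on disk for hot/warm and cold buckets.
daily_volume_gb = 5          # daily data volume
raw_compression = 0.15       # raw compression factor (compressed rawdata)
metadata_factor = 0.35       # metadata size factor (index files)
hot_warm_days = 30
cold_days = 60

per_day_gb = daily_volume_gb * (raw_compression + metadata_factor)

hot_warm_gb = per_day_gb * hot_warm_days
cold_gb = per_day_gb * cold_days

print(f"Hot/Warm: {hot_warm_gb} GB, Cold: {cold_gb} GB")

This reproduces the searchable-tier numbers that matched the sizing website; it is only the frozen tier I cannot reproduce.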

[Screenshot: Splunk sizing calculator with the inputs above]

Then, for testing, I increased the Archived (Frozen) slider from 0 days to 9 months and found that the "Archived" storage requirement is now 202.5 GB.
My question is: what calculation is used here to arrive at the "Archived" storage requirement of 202.5 GB?

[Screenshot: Archived (Frozen) slider set to 9 months, showing 202.5 GB]

Thank you in advance.



1 Solution

scelikok
SplunkTrust

Hi @muradgh,

The frozen path keeps only the compressed raw data, so the calculation uses the formula below:

DailyDataVolume = 5 GB
RawCompressionFactor = 0.15
NumberOfDaysInFrozen = 9 months = 270 days

ArchivedStorageSpace = DailyDataVolume * RawCompressionFactor * NumberOfDaysInFrozen

ArchivedStorageSpace = 5 * 0.15 * 270 = 202.5 GB
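
The same arithmetic as a quick Python sketch, in case you want to plug in other retention values (the variable names are just illustrative):

# Frozen keeps only the compressed raw data, so the metadata size factor does not apply.
daily_data_volume_gb = 5
raw_compression_factor = 0.15
days_in_frozen = 9 * 30          # 9 months = 270 days

archived_storage_gb = daily_data_volume_gb * raw_compression_factor * days_in_frozen
print(archived_storage_gb)       # 202.5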

Please keep in mind that Splunk does not manage the archive path; it just moves the raw data there, and there is no retention check. This calculation is only meant to help you provision enough space; you will need to clean out data older than 9 months manually.
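
As a rough illustration only (the frozen directory path is an assumption, and this relies on the frozen buckets keeping the usual db_<newestTime>_<oldestTime>_<id> naming), a scheduled script along these lines could prune buckets whose newest event is older than your 9-month window:

import os
import shutil
import time

FROZEN_DIR = "/opt/splunk/frozen"   # assumed coldToFrozenDir location, adjust to yours
RETENTION_DAYS = 270                # 9 months

cutoff_epoch = time.time() - RETENTION_DAYS * 86400

for name in os.listdir(FROZEN_DIR):
    bucket_path = os.path.join(FROZEN_DIR, name)
    parts = name.split("_")
    # Expect db_<newestTime>_<oldestTime>_<id>; skip anything else.
    if not os.path.isdir(bucket_path) or len(parts) < 4 or parts[0] not in ("db", "rb"):
        continue
    try:
        newest_event_epoch = int(parts[1])
    except ValueError:
        continue
    # Delete the bucket only when even its newest event is past the cutoff.
    if newest_event_epoch < cutoff_epoch:
        shutil.rmtree(bucket_path)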

 

If this reply helps you, an upvote and "Accept as Solution" would be appreciated.


muradgh
Path Finder

Hi @scelikok 

Thank you very much.
