What will be the size of the indexed data if I send 50GB of raw data to Splunk?

splunker12er
Motivator

I send 50 GB of raw data daily from my machines to Splunk for indexing.
What will be the size of the data after it has been indexed?

Will it still be 50 GB, or will the indexed data take up less space?

MuS
Legend

Hi splunker12er,

it all depends on your raw data, but as a rule of thumb, the indexed size on disk ends up at roughly 30-50% of the raw size, so 50 GB of raw data would typically occupy about 15-25 GB on disk. You can check the ratio for your own indexes with this search:

 | dbinspect index=YOURINDEX
 | fields state,id,rawSize,sizeOnDiskMB 
 | stats sum(rawSize) AS rawTotal, sum(sizeOnDiskMB) AS diskTotalinMB
 | eval rawTotalinMB=(rawTotal / 1024 / 1024) | fields - rawTotal
 | eval compression=tostring(round(diskTotalinMB / rawTotalinMB * 100, 2)) + "%"
 | table rawTotalinMB, diskTotalinMB, compression
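
If you just want a quick back-of-the-envelope estimate instead of inspecting existing buckets, here is a minimal sketch that applies the 30-50% rule of thumb to a daily ingest volume (the raw_GB value and field names are made up for illustration):

 | makeresults
 | eval raw_GB=50, ratio_low=0.30, ratio_high=0.50
 | eval est_disk_low_GB=raw_GB * ratio_low, est_disk_high_GB=raw_GB * ratio_high
 | table raw_GB, est_disk_low_GB, est_disk_high_GB

The actual ratio varies with your data, so the dbinspect search above, run against your real indexes, gives the authoritative number.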

cheers,

MuS
