<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: compression rate of indexed data: 50gig/day in 3 weeks uses 100gig HDD space in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/47348#M179434</link>
    <description>&lt;P&gt;Okay, thanks for the answer. The compression rate is 21%, which seems pretty good.&lt;/P&gt;</description>
    <pubDate>Thu, 30 Aug 2012 08:57:40 GMT</pubDate>
    <dc:creator>jan_wohlers</dc:creator>
    <dc:date>2012-08-30T08:57:40Z</dc:date>
    <item>
      <title>compression rate of indexed data: 50gig/day in 3 weeks uses 100gig HDD space</title>
      <link>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/47346#M179432</link>
      <description>&lt;P&gt;Hey,&lt;/P&gt;

&lt;P&gt;We set up an indexer 3 weeks ago, and by now we are indexing about 50 GB per 24 hours. If I go to Manager -&amp;gt; Indexes, I can see that our main index only has a size of about 100 GB. Mostly event logs are being indexed. Is the compression really so good that about 20 days of 50 GB/day fit into a 100 GB index?&lt;/P&gt;

&lt;P&gt;Thanks for your answer in advance!&lt;/P&gt;

&lt;P&gt;Jan&lt;/P&gt;</description>
      <pubDate>Thu, 30 Aug 2012 08:27:35 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/47346#M179432</guid>
      <dc:creator>jan_wohlers</dc:creator>
      <dc:date>2012-08-30T08:27:35Z</dc:date>
    </item>
    <item>
      <title>Re: compression rate of indexed data: 50gig/day in 3 weeks uses 100gig HDD space</title>
      <link>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/47347#M179433</link>
      <description>&lt;P&gt;Hi jan_wohlers,&lt;/P&gt;

&lt;P&gt;Basically, a compression ratio between 40-50% is normal; you can check it with the following search:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| dbinspect index=_internal
| fields state,id,rawSize,sizeOnDiskMB 
| stats sum(rawSize) AS rawTotal, sum(sizeOnDiskMB) AS diskTotalinMB
| eval rawTotalinMB=(rawTotal / 1024 / 1024) | fields - rawTotal
| eval compression=tostring(round(diskTotalinMB / rawTotalinMB * 100, 2)) + "%"
| table rawTotalinMB, diskTotalinMB, compression
&lt;/CODE&gt;&lt;/PRE&gt;
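
&lt;P&gt;A note on the units (the numbers below are made up, purely to illustrate the arithmetic): &lt;CODE&gt;rawSize&lt;/CODE&gt; from &lt;CODE&gt;dbinspect&lt;/CODE&gt; is reported in bytes while &lt;CODE&gt;sizeOnDiskMB&lt;/CODE&gt; is in MB, which is why the raw total is divided by 1024 twice before the ratio is taken. You can sanity-check the math with &lt;CODE&gt;makeresults&lt;/CODE&gt;:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| makeresults
| eval rawTotal=107374182400, diskTotalinMB=21504
| eval rawTotalinMB=(rawTotal / 1024 / 1024)
| eval compression=tostring(round(diskTotalinMB / rawTotalinMB * 100, 2)) + "%"
| table rawTotalinMB, diskTotalinMB, compression
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Here 107374182400 bytes of raw data (100 GB, i.e. 102400 MB) sitting on 21504 MB of disk works out to roughly 21%.&lt;/P&gt;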

&lt;P&gt;cheers,&lt;/P&gt;

&lt;P&gt;MuS&lt;/P&gt;</description>
      <pubDate>Thu, 30 Aug 2012 08:47:22 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/47347#M179433</guid>
      <dc:creator>MuS</dc:creator>
      <dc:date>2012-08-30T08:47:22Z</dc:date>
    </item>
    <item>
      <title>Re: compression rate of indexed data: 50gig/day in 3 weeks uses 100gig HDD space</title>
      <link>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/47348#M179434</link>
      <description>&lt;P&gt;Okay, thanks for the answer. The compression rate is 21%, which seems pretty good.&lt;/P&gt;</description>
      <pubDate>Thu, 30 Aug 2012 08:57:40 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/47348#M179434</guid>
      <dc:creator>jan_wohlers</dc:creator>
      <dc:date>2012-08-30T08:57:40Z</dc:date>
    </item>
    <item>
      <title>Re: compression rate of indexed data: 50gig/day in 3 weeks uses 100gig HDD space</title>
      <link>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/47349#M179435</link>
      <description>&lt;P&gt;Thank you for this handy search example!&lt;BR /&gt;
I have a couple of questions regarding it:&lt;BR /&gt;
1) How can I build a search that gives me a table of all indexes present, with compression ratio information? I tried this:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| dbinspect index=*
  | fields state,id,rawSize,sizeOnDiskMB 
  | stats sum(rawSize) AS rawTotal, sum(sizeOnDiskMB) AS diskTotalinMB by index
  | eval rawTotalinMB=(rawTotal / 1024 / 1024) | fields - rawTotal
  | eval compression=tostring(round(diskTotalinMB / rawTotalinMB * 100, 2)) + "%"
  | table rawTotalinMB, diskTotalinMB, compression
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;But it didn't work.&lt;BR /&gt;
2) What does it mean when I get this:&lt;BR /&gt;
&lt;IMG src="http://i069.radikal.ru/1505/60/bc8639ccac55.png" alt="alt text" /&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 12 May 2015 08:24:08 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/47349#M179435</guid>
      <dc:creator>ibondarets</dc:creator>
      <dc:date>2015-05-12T08:24:08Z</dc:date>
    </item>
    <item>
      <title>Re: compression rate of indexed data: 50gig/day in 3 weeks uses 100gig HDD space</title>
      <link>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/47350#M179436</link>
      <description>&lt;P&gt;I have a compression percentage similar to ibondarets's.&lt;/P&gt;

&lt;P&gt;I guess that means our data is actually larger once indexed.&lt;/P&gt;</description>
      <pubDate>Tue, 12 May 2015 20:50:18 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/47350#M179436</guid>
      <dc:creator>ConnorG</dc:creator>
      <dc:date>2015-05-12T20:50:18Z</dc:date>
    </item>
    <item>
      <title>Re: compression rate of indexed data: 50gig/day in 3 weeks uses 100gig HDD space</title>
      <link>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/708261#M239524</link>
      <description>&lt;P&gt;See my reply here, in case it helps:&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.splunk.com/t5/Deployment-Architecture/Splunk-Storage-Sizing-Guidelines-and-calculations/m-p/708258/highlight/true#M29013" target="_blank"&gt;https://community.splunk.com/t5/Deployment-Architecture/Splunk-Storage-Sizing-Guidelines-and-calculations/m-p/708258/highlight/true#M29013&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 15:16:00 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/compression-rate-of-indexed-data-50gig-day-in-3-weeks-uses/m-p/708261#M239524</guid>
      <dc:creator>edoardo_vicendo</dc:creator>
      <dc:date>2025-01-08T15:16:00Z</dc:date>
    </item>
  </channel>
</rss>

