I am indexing to the main index, which has a max size of 500,000 MB defined. So far, I have indexed about 14,000 MB of data, and I noticed that roughly 14 GB of my hard drive was consumed right away, which appears to correlate with the 14,000 MB indexed. At the moment, my hard drive is only 40 GB in total. Does this mean the 500,000 MB max size effectively won't apply, since I only have 40,000 MB of disk space? Would I need to increase my hard drive to 500 GB to fully utilize the 500,000 MB setting?
Thanks for any insight/direction that can be given!
This is within the expected size range:
Typical ASCII syslog data indexes to about 50% of its original size, though it can fall anywhere from 10% to 200%, with 20% to 120% of the original source data size being the common range.
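For reference, the cap being discussed is the `maxTotalDataSizeMB` setting in `indexes.conf`, which limits the on-disk size of the index (not the raw data read in). A sketch of how it might look for the main index, with the stanza and comments illustrative:

```ini
# indexes.conf -- per-index size cap (stanza shown for illustration)
[main]
# Maximum total size of the index on disk, in MB (the default is 500000).
# This bounds the indexed, compressed data, not the original source size.
maxTotalDataSizeMB = 500000
```

Because the limit applies to on-disk size, the index can never grow past the free space actually available, whichever is smaller.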
I was not able to get the "for" line commands to run, but I did set up a temporary Splunk install elsewhere to measure the on-disk storage size of one of my data inputs. It turned out that the reported stored size was about three times the size of the raw data read in.
So, am I correct in assuming that I would need 500 GB of disk to fully utilize the default 500,000 MB max size set for indexes?
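The sizing here is just arithmetic, and a minimal back-of-the-envelope sketch in Python may help, assuming the on-disk index size scales linearly with the raw data (the 3x figure is the ratio observed above, not a general rule):

```python
# Rough sizing arithmetic for an index cap vs. available disk.
max_index_mb = 500_000   # maxTotalDataSizeMB (the default per-index cap)
disk_mb = 40_000         # total disk available in this scenario

# The cap is an on-disk limit, so the smaller of the two wins:
effective_cap_mb = min(max_index_mb, disk_mb)
print(effective_cap_mb)  # 40000 -- the 40 GB disk fills long before the cap

# At the observed ~3x on-disk expansion, raw data ingested before disk is full:
expansion = 3.0
raw_mb_before_full = round(disk_mb / expansion)
print(raw_mb_before_full)  # 13333
```

In other words, to actually reach the 500,000 MB cap you would need at least 500 GB of disk for that index alone, plus headroom for the OS and Splunk itself.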