I've seen many posts that seem to cover parts of this, but here's what I'm looking for:
I have a single instance of Splunk running on a physical server with a 1 TB hard drive. I don't have enough space to hold the required amount of data, so our network team has added me to the SAN and given me 2 TB. So now I have a D: drive where Splunk is installed along with the data, and an E: drive that is empty.
What is the best way to distribute this space so I can maximize how far back I can search historically while still getting the best performance?
I'm a little fuzzy on the hot/warm/cold bucket concept, as I currently have only one location defined to hold everything...
Thanks!
Found enough detail in the manual to do the move myself. It involves shutting down Splunk, moving the files, and editing the configuration so Splunk looks for the data in the new location before restarting it...
It's 99% complete; there are still a few files (not the huge database ones) being updated in the old path, but I can live with that. The core database files are now collecting on the new drive.
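For anyone who wants the rough shape of it, here is a sketch of what the move looked like on my Windows box. The install path, the target folder E:\splunk_data, and the choice of robocopy are just examples for my setup; check the docs for your version before changing anything.

    REM Stop Splunk before touching any index directories
    "D:\Program Files\Splunk\bin\splunk.exe" stop

    REM Copy the main index's buckets to the new drive (example destination)
    robocopy "D:\Program Files\Splunk\var\lib\splunk\defaultdb" "E:\splunk_data\defaultdb" /E /COPYALL

    REM Edit %SPLUNK_HOME%\etc\system\local\indexes.conf so homePath/coldPath
    REM point at the new location, then restart
    "D:\Program Files\Splunk\bin\splunk.exe" start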
Did you complete this? I have to do something similar now and am curious about how this went and what you chose to do.
This documentation topic explains how indexes/buckets work overall; it might be helpful in making your decision:
http://docs.splunk.com/Documentation/Splunk/5.0.2/Indexer/HowSplunkstoresindexes
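If it helps, here is a hypothetical indexes.conf stanza for your two-drive scenario (the paths and size values are made up, so adjust them to your layout). The usual split is to keep hot/warm buckets on the faster local drive and let cold buckets roll to the larger SAN volume:

    [main]
    # hot and warm buckets stay on the local D: drive for search performance
    homePath   = D:\splunk_data\defaultdb\db
    # cold buckets roll over to the bigger E: SAN volume
    coldPath   = E:\splunk_data\defaultdb\colddb
    thawedPath = E:\splunk_data\defaultdb\thaweddb
    # total cap across hot + warm + cold, in MB; raise it to use the extra space
    maxTotalDataSizeMB = 2500000
    # number of warm buckets to keep before the oldest rolls to cold
    maxWarmDBCount = 300

Recent data lives in hot/warm buckets and tends to be searched most often, so keeping those on the faster local storage and putting the bulk of the retention (cold) on the SAN is the typical trade-off.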
This topic covers the performance of search vs. indexing:
http://docs.splunk.com/Documentation/Splunk/5.0.2/Deploy/Distributeindexingandsearching