I have seen a question regarding this, but it doesn't seem to explain much. I'm looking to move multiple db_* buckets to thaweddb. I move the data like so:
cp -r db_140* $SPLUNK_DB/web_logging/thaweddb/
My question: what command can I run to rebuild all of the content in this thaweddb folder at once? Rebuilding one db_* at a time takes forever. My indexer is running on Linux. For reference, the index and paths involved are below, followed by a sketch of the sequence I'm running.
index=web_logging
$SPLUNK_DB/web_logging/thaweddb
$SPLUNK_DB/web_logging/frozendb
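To be concrete, this is roughly the end-to-end sequence I'm following (a rough sketch, assuming $SPLUNK_DB is set on the indexer, Splunk is installed under /opt/splunk, and the cp is run from the directory holding the frozen db_140* buckets):

# copy the frozen buckets into thaweddb (paths as above)
cp -r db_140* $SPLUNK_DB/web_logging/thaweddb/

# rebuild each restored bucket one at a time -- this is the slow part
cd $SPLUNK_DB/web_logging/thaweddb
for bucket in db_*; do
    /opt/splunk/bin/splunk rebuild "$bucket"
done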
Does restoring count against your daily index limit?
No, because that data was indexed before and you paid for that "index process". Don't worry about that.
Verify this with your Splunk account manager, but in my experience and training, previously indexed data does not incur a license cost, no matter how you move it around or rebuild it from frozen to thawed. If you manipulate the data and reindex it, however, then you will incur a cost, since you are materially changing the indexed data.
So I ran the above command to rebuild the multiple db_* buckets within the thawed directory; it took about 9 hours to complete. I suppose this is the only solution if you need to restore a lot of data.
If possible, please provide an estimate of the total size in GB of the 400 buckets.
9 hours = 400 buckets = ? GB
Thank you.
I'd like to know the time and size as well.
Is there a faster way of rebuilding the buckets? I can run the following command:
cd /opt/splunk/var/lib/splunk/web_logging/thaweddb ; ls | xargs -I {} /opt/splunk/bin/splunk rebuild {}
But I have over 400 buckets I need to rebuild!
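One thing that might speed this up is rebuilding several buckets in parallel instead of one at a time. A minimal sketch, assuming GNU xargs is available and the indexer has spare CPU and disk I/O headroom (adjust -P to taste):

cd /opt/splunk/var/lib/splunk/web_logging/thaweddb
# rebuild up to 4 buckets at a time; match only bucket directories
ls -d db_* | xargs -P 4 -I {} /opt/splunk/bin/splunk rebuild {}

Since each rebuild works on a single bucket independently, the wall-clock time should drop roughly in proportion to the parallelism until CPU or disk I/O on the indexer becomes the bottleneck.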