I would like to export index data from my Production instance of Splunk and import that same index data into a Test instance of Splunk, solely for the purposes of evaluating apps, refining searches and education. The Test instance will not need to receive any further data; ideally it will exist on a standalone server. I don't mind exactly how much data I have in the index (perhaps 60 days), as the sources have been consistent for some time now. If something as simple as copying some cold index files works, that approach suits me too.
I have reviewed the wiki article "How to move an index from one Splunk installation to another" (http://wiki.splunk.com/Community:MoveIndexes) and it appears I only need to follow steps 2 and 3. Can someone please offer some advice as to whether this is the best approach to achieving my goals of evaluating and tuning potential apps, refining searches, and education?
Yep, that's pretty much it.
Don't copy the .bucketManifest file; Splunk will re-read the buckets and automatically rewrite it.
You can take a subset of the data by copying whichever buckets you want (based on the epoch times in the bucket directory names) from cold or warm storage. As it's a new test instance, you won't have to rename bucket IDs or anything like that.
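For example, warm and cold bucket directories are named with the newest and oldest event epoch times they contain, so you can check the time range a bucket covers before copying it across. A minimal sketch, assuming GNU date and hypothetical paths for an index called domain (the bucket name shown is an example):

```shell
# Hypothetical source/destination paths -- adjust for your environment.
SRC=/opt/splunk/var/lib/splunk/domain/colddb
DST=/opt/splunk_test/var/lib/splunk/domain/colddb

# Bucket directories are named db_<newestEpoch>_<oldestEpoch>_<id>.
# Check the time range a bucket covers before deciding to copy it:
b=db_1700000000_1690000000_42        # example bucket directory name
newest=$(echo "$b" | cut -d_ -f2)
oldest=$(echo "$b" | cut -d_ -f3)
echo "$b spans $(date -u -d @"$oldest" +%F) to $(date -u -d @"$newest" +%F)"
# → db_1700000000_1690000000_42 spans 2023-07-22 to 2023-11-14

# Then copy the chosen buckets (with Splunk stopped on the test instance):
# cp -r "$SRC/$b" "$DST/"
```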
Thank you for the response.
I have tried adding a subset of the data by copying a single bucket folder from last month (based on its epoch times), but when I restart Splunk the data is not picked up or searchable. Are you able to elaborate on your instructions so I can examine why it isn't working? As a side note, the data is from an index called domain as opposed to the default index of main.
Ensure that your user/role has access to the index in question. Go to
Manager -> Access Controls -> Roles -> <your role>.
At the bottom you'll find two settings for access rights, and which indexes are searched by default.
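If you manage roles through configuration files rather than the UI, the same two settings live in authorize.conf. A minimal sketch — the role name is an example, and domain is the index mentioned above:

```ini
# $SPLUNK_HOME/etc/system/local/authorize.conf (sketch; role name is hypothetical)
[role_test_user]
# Indexes this role is allowed to search
srchIndexesAllowed = main;domain
# Indexes searched by default when no index= is specified
srchIndexesDefault = main;domain
```

Restart Splunk (or reload the auth configuration) after editing the file.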
Can this approach be used in a cluster setup? The data is pretty much sharded across different hosts.