Hi Community,
I need to move current data from one of my indexes into an S3 bucket. Is that possible?
I read about the SmartStore feature. However, I need to move the data into another location after the index reaches a particular size.
The problem is that I have an index that is growing like crazy. I need the data to be available for at least one year and searchable for at least six months. And of course, local storage is too expensive.
So I am confused about the best approach to take, since that's the only index causing me issues.
In 60 days, I have accumulated around 300 GB of data.
Thanks,
You can move just that one index to S3. But because there is no warm/cold distinction when using SmartStore with S3, you will be migrating the full index to S3 (Splunk takes care of the local caching for you).
First, don't mess with indexes behind Splunk's back. Unhappiness may ensue.
SmartStore can be enabled for a single index. However, you should use S3 for SmartStore only if your indexers are already in AWS. Otherwise, the data egress charges could be prohibitive.
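To illustrate what per-index SmartStore could look like, here is a minimal indexes.conf sketch. The volume name, bucket path, endpoint, and index name are placeholders, and the static credentials are shown only for illustration (an IAM instance role is usually preferable on AWS); check the SmartStore documentation for your Splunk version before adapting it.

```
# indexes.conf -- hypothetical per-index SmartStore setup (all names/paths are placeholders)

# Define the remote S3 volume that SmartStore will use.
[volume:remote_store]
storageType = remote
path = s3://my-smartstore-bucket/indexes
# Static keys shown for illustration only; an IAM role is generally preferred.
remote.s3.access_key = <ACCESS_KEY>
remote.s3.secret_key = <SECRET_KEY>
remote.s3.endpoint = https://s3.us-east-1.amazonaws.com

# Enable SmartStore for just the one fast-growing index.
[my_big_index]
remotePath = volume:remote_store/$_index_name
homePath = $SPLUNK_DB/my_big_index/db
# coldPath must still be defined even though SmartStore does not use a cold tier.
coldPath = $SPLUNK_DB/my_big_index/colddb
thawedPath = $SPLUNK_DB/my_big_index/thaweddb
```

Keep in mind as well that, per Splunk's docs, migrating an index to SmartStore is a one-way operation, so test on a non-production index first.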