When we set up data model acceleration, we see the All Time option for the summary range. Can we really keep accelerated data going back without limit? And how can we measure the storage and processing impact of this setting?
Yes, you can accelerate the data model for All Time, but you are still limited by the retention period of the index (or indexes) that contribute to the data model. I would be careful with setting it to All Time, though, because of the disk space usage you mentioned.
You could probably get a better answer if you provided a specific use case. Or were you just curious?
I guess you're right, @jnudell_2, about my real question: what is the relationship between the Summary Range of the data model and the retention of the index that contributes to it?
As mentioned in the answer, the summary range cannot extend past the retention limit of the index, because data model acceleration summaries are stored alongside the index's buckets and are subject to the retention settings of the index they are built on. So even if you set the data model to accelerate for All Time, that only covers all of the time still available in that particular index, given its retention period.
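To make that interaction concrete, here is a minimal illustrative sketch (plain Python, not SPL; the function name and the retention numbers are hypothetical, not anything Splunk exposes) of how the effective acceleration window is always capped by index retention:

```python
# Illustrative only: the effective acceleration window can never exceed
# the retention of the underlying index, because the summaries live
# alongside the index buckets and age out with them.

ALL_TIME = float("inf")  # stand-in for the "All Time" summary range


def effective_summary_days(summary_range_days, index_retention_days):
    """Days of data actually available in the accelerated summary."""
    return min(summary_range_days, index_retention_days)


# Hypothetical index with 90-day retention:
print(effective_summary_days(ALL_TIME, 90))  # "All Time" -> capped at 90
print(effective_summary_days(30, 90))        # 30-day range -> 30
```

In other words, "All Time" buys you nothing beyond the index's retention; it just means the summary covers whatever data the index still holds.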
Any thoughts on this one, by any chance?