How far back in time will datamodel data be kept?

jamesvz84
Communicator

How far back in time will datamodel data be kept? I have a datamodel with a large quantity of data, and I am wondering whether the oldest records will ever "roll off", similar to the index bucketing settings. Are there other limits (besides disk space) that I should watch out for as the datamodel data grows larger?

1 Solution

Ayn
Legend

That's defined when you accelerate a data model: the "Summary Range" setting. You can see which range is currently configured for each data model by looking at its acceleration details in the data model settings in Splunk Web; the summary range shown there is the retention period for the accelerated summary data.
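For reference, a minimal sketch of how that setting looks on disk, assuming a hypothetical data model stanza named my_datamodel in datamodels.conf (the Summary Range you pick in Splunk Web is stored as acceleration.earliest_time):

    [my_datamodel]
    acceleration = 1
    # Summary range: keep accelerated summaries for the last 3 months.
    # Summary data older than this ages out of the acceleration summaries.
    acceleration.earliest_time = -3mon
    # Hypothetical schedule for how often the summaries are updated.
    acceleration.cron_schedule = */5 * * * *

Note that this only controls how long the accelerated summaries are kept. The underlying indexed events are still governed by your index retention settings, and searches outside the summary range can still be answered from the raw events, just without the acceleration benefit.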
