I have a dynamic archive storage allotment, and with Professional Services we migrated our on-prem archive to archive storage for buckets containing data up to 365 days old. However, if I select Archive in the Indexes view, I see two large indexes in particular that for some reason show data start dates in 2017 and February 2019 (365 days of retention would put the earliest date around October 2019), and these are pushing me over my storage allotment... Is there a way I can query or view (read-only) the bucket detail in dynamic archive storage (Splunk-managed S3), including each bucket's dates, so I can identify which buckets have an unusually long data span and request that they be removed?
I have opened a ticket with Support and they are investigating as well...
I realize that retention-based removal is actually driven by bucket end dates, so I must have some buckets that either have corrupted dates or that start back in 2017 (or before October 2019) but have end dates later than October 2019...
Is there a way for a cloud admin user to query the dynamic archive storage to list the data directories, discover the buckets, and see their individual data start and end dates...???
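For context, for searchable (non-archived) buckets I can already see per-bucket date spans with `dbinspect`; as far as I know this does not reach the archived copies, which is exactly my gap. The index name below is just a placeholder:

```
| dbinspect index=my_index
| eval startDate=strftime(startEpoch, "%Y-%m-%d"), endDate=strftime(endEpoch, "%Y-%m-%d")
| table bucketId, state, startDate, endDate, sizeOnDiskMB
| sort startDate
```

I'm hoping there is an equivalent view (REST endpoint, search command, or UI page) that a cloud admin can run against the archive tier itself.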