Frozen bucket archival to S3 in Splunk trial version from AWS Marketplace


Hi all,

I am new to Splunk administration and am doing a PoC on archiving frozen bucket data to an S3 bucket. Can I directly provide the S3 URL in Splunk Web under the index settings, or do I need to provide an archive script via the coldToFrozen setting?

Also, while setting the archiving policy, do I need to change only frozenTimePeriodInSecs, or do I need to change both maxTotalDataSizeMB and frozenTimePeriodInSecs?

Please excuse this silly question, as I am new to administration; I am looking for the best practice 🙂





Freezing buckets to something other than a mounted file system requires a custom script specified by the index's coldToFrozenScript parameter. The process is similar for both "classic" and SmartStore indexes:
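For a "classic" index, that parameter lives in indexes.conf. A minimal sketch of the stanza (the index name, script filename, and retention value below are placeholders of mine, not recommendations):

```
[my_index]
coldToFrozenScript = "$SPLUNK_HOME/bin/python" "$SPLUNK_HOME/bin/coldToFrozenS3.py"
frozenTimePeriodInSecs = 7776000
```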

Splunk provides an example script (coldToFrozenExample.py) in $SPLUNK_HOME/bin/
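As a rough sketch of what an S3 variant of such a script might look like (the script name, the S3 destination, and the assumption that the AWS CLI is installed and has credentials configured are all mine, not Splunk's):

```python
#!/usr/bin/env python3
"""Hypothetical coldToFrozenScript sketch: upload a frozen bucket's
rawdata journal to S3 via the AWS CLI before Splunk deletes the bucket."""
import os
import subprocess
import sys

# Assumption: replace with your own destination bucket/prefix.
S3_DEST = "s3://example-frozen-archive"

def build_copy_command(bucket_dir, s3_dest):
    """Build the `aws s3 cp` command for this bucket's journal.gz."""
    journal = os.path.join(bucket_dir, "rawdata", "journal.gz")
    bucket_name = os.path.basename(bucket_dir.rstrip(os.sep))
    return ["aws", "s3", "cp", journal, f"{s3_dest}/{bucket_name}/journal.gz"]

def main():
    # Splunk invokes the script with the bucket directory as the only argument.
    cmd = build_copy_command(sys.argv[1], S3_DEST)
    # Splunk only removes the bucket when the script exits 0,
    # so fail loudly (non-zero exit) if the upload fails.
    subprocess.run(cmd, check=True)

if __name__ == "__main__" and len(sys.argv) == 2:
    main()
```

The important design point is the exit code: returning non-zero tells Splunk the archive attempt failed, so the bucket is retried later rather than deleted.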

If you're considering S3 and not a lower-cost alternative like S3 Glacier, you may want to look at SmartStore (Splunk's native S3 integration) instead.

There's nearly a decade of prior art on various ways to migrate frozen buckets to S3 but no "best practices" I'm aware of for implementing custom coldToFrozenScript scripts beyond the example provided by Splunk.

In effect, you'll be running an AWS CLI S3 command to copy <bucket>/rawdata/journal.gz to the S3 bucket of your choice.
