Deployment Architecture

Is there an automatic way to archive frozen data to S3 Glacier?

ejmin
Path Finder

Hi, does anyone have experience archiving data to S3 Glacier using a script or any third-party apps? I already know the steps for uploading files to S3 Glacier with AWS CLI commands, but that kind of configuration is manual. My goal is to automatically upload all data that lands in the frozen directory to S3 Glacier, much like how the Splunk forwarder works.
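For reference, the Splunk-side hook for automating this is the coldToFrozenScript setting in indexes.conf: Splunk runs the configured script once for each bucket as it rolls to frozen and passes the bucket's directory path as the last argument, so the archive step needs no manual intervention. A minimal sketch, with a placeholder index name and script path:

    [my_index]
    # Splunk calls this script for every bucket that rolls to frozen,
    # appending the bucket directory path as the final argument
    coldToFrozenScript = "$SPLUNK_HOME/bin/python" "$SPLUNK_HOME/etc/apps/frozen_archive/bin/cold2frozen.py"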


isoutamo
SplunkTrust
Hi
Atlassian has published an example cold2frozen Python script that you could use. It copies frozen buckets to the S3 bucket you define.
r. Ismo
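The idea behind such cold2frozen scripts is straightforward: Splunk hands the script the path of a bucket that is rolling to frozen, and the script packages it up and pushes it to an S3 bucket. Below is a minimal sketch of that pattern using boto3; the bucket name and key prefix are placeholders, and this is not the Atlassian script itself.

    #!/usr/bin/env python
    # cold2frozen.py -- sketch: archive a frozen Splunk bucket to S3.
    import os
    import sys
    import tarfile
    import tempfile

    import boto3

    S3_BUCKET = "my-splunk-frozen-archive"   # placeholder bucket name
    KEY_PREFIX = "frozen/"                   # placeholder key prefix

    def main():
        # Splunk passes the bucket directory path as the last argument.
        bucket_dir = sys.argv[-1]
        name = os.path.basename(bucket_dir.rstrip("/"))

        # Tar the bucket directory so it uploads as a single object.
        fd, tmp_path = tempfile.mkstemp(suffix=".tar.gz")
        os.close(fd)
        with tarfile.open(tmp_path, "w:gz") as tar:
            tar.add(bucket_dir, arcname=name)

        # Upload the archive; if this raises, the script exits non-zero
        # and Splunk does not treat the bucket as successfully archived.
        boto3.client("s3").upload_file(tmp_path, S3_BUCKET, KEY_PREFIX + name + ".tar.gz")
        os.unlink(tmp_path)

    if __name__ == "__main__":
        main()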

ejmin
Path Finder

Yeah... I saw it on Google, but I think it is for S3 only. Is it also applicable to S3 Glacier? S3 and S3 Glacier use different storage models: S3 is for quick-access storage, while S3 Glacier is for low-cost archival storage.


isoutamo
SplunkTrust
That's a good point. I haven't checked whether it can also store to Glacier or not, but I think it could at least be modified quite easily to store to Glacier as well, or to do it via S3 as temporary storage.
I need to check this later on.
r. Ismo
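If the script already uses boto3 to upload into S3, the Glacier part may be as small a change as requesting the Glacier storage class on upload. A sketch, not verified against the Atlassian script:

    # Same upload call as before, but request the GLACIER storage class.
    boto3.client("s3").upload_file(
        tmp_path, S3_BUCKET, KEY_PREFIX + name + ".tar.gz",
        ExtraArgs={"StorageClass": "GLACIER"},
    )

Note that this still goes through the normal S3 API (the object simply lands in the Glacier storage class); it is not the standalone Glacier vault API.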

ejmin
Path Finder

OK, thank you in advance for that.


isoutamo
SplunkTrust
I just looked at the AWS Glacier API, and it looks like you cannot (or should not) use it directly from cold2frozen scripts. I suggest moving frozen buckets to S3 first and then transitioning them to Glacier regularly, either as a batch job or through some other automated mechanism.
r. Ismo
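One automated way to do that second hop is an S3 lifecycle rule that transitions objects to the Glacier storage class shortly after they land in standard S3. A sketch with the AWS CLI; the bucket name, prefix, and timing are placeholders:

    lifecycle.json (move objects under frozen/ to Glacier one day after upload):

    {
      "Rules": [
        {
          "ID": "frozen-to-glacier",
          "Status": "Enabled",
          "Filter": { "Prefix": "frozen/" },
          "Transitions": [ { "Days": 1, "StorageClass": "GLACIER" } ]
        }
      ]
    }

    Apply the rule to the staging bucket:

    aws s3api put-bucket-lifecycle-configuration \
        --bucket my-splunk-frozen-archive \
        --lifecycle-configuration file://lifecycle.json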

ejmin
Path Finder

Hmm, you have a point there. It is possible to do that, but I don't think it will work on our side because it adds extra cost on our end, which it shouldn't. Our data volume is estimated at roughly a terabyte, and it will grow by about 1 TB per year, so our only choice is S3 Glacier.


isoutamo
SplunkTrust
If you do that migration a couple of times per day and reserve a 2-5 day spool for issues, then I don't see S3 as a big financial issue. At roughly 1 TB per year, that is only a few GB of frozen data per day, so a 2-5 day spool is on the order of 10-15 GB of temporary S3 storage.