Amazon Glacier - Offsite Backup

Amazon Glacier is an extremely low-cost storage service that provides secure and durable storage for data archiving and backup. In order to keep costs low, Amazon Glacier is optimized for data that is infrequently accessed and for which retrieval times of several hours are suitable. With Amazon Glacier, customers can reliably store large or small amounts of data for as little as $0.01 per gigabyte per month, a significant savings compared to on-premises solutions.

Looks like the big dog has decided to compete with Crashplan and the like.

I got excited and signed up. It verified me, everything looked cool, I clicked on the link they sent me, and now I'm staring at the Amazon Web Services nexus page with 20+ services listed, all named something weird, and not a single mention of Glacier anywhere on the page.

Yikes.

I don’t think “user-friendly” was in the terms of service anyplace.

In that case, they nailed it. This is for hardcore tech geeks - the complete opposite of CrashPlan's approach.

Yeah, this thing is designed for enterprise-level archival backup, and the intent is that you use their provided APIs to manage the data. The GUI is kind of a nightmare.

Yeah. I need a low-cost, massive off-site backup solution, but this looks a little too raw at the moment; some of the commands make it feel more like I'm storing my files in an SQL database rather than on a file server somewhere. Which might very well be the case, but I don't really wanna futz around with command words and the like rather than an uploader/downloader.

If they design a decent frontend and make it simple to maintain incremental backups of stuff on my PC and then retrieve it ad hoc or all at once (the "vaults" terminology they use makes it sound like you're pre-apportioning your data into discrete, separately retrievable chunks, but that you have to do so manually), then I'm all for it. Otherwise, err, no.
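For reference, the vault/archive model looks roughly like this in code - a minimal sketch using the current boto3 SDK; the vault name, file name, and region here are made-up placeholders, not anything Amazon prescribes:

```python
import boto3

glacier = boto3.client("glacier", region_name="us-east-1")

# A "vault" is just a named container you create up front.
glacier.create_vault(vaultName="pc-backups")

# Each upload becomes an opaque "archive" inside that vault; Glacier
# hands back an archive ID that you have to track yourself.
with open("2012-08-photos.tar.gz", "rb") as f:
    resp = glacier.upload_archive(
        vaultName="pc-backups",
        archiveDescription="Monthly photo backup",
        body=f,
    )
print(resp["archiveId"])  # needed later to retrieve or delete the archive
```

That's the "pre-apportioning" part: you decide what goes into each archive, and any incremental-backup logic is entirely on you.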

Yeah, the Amazon AWS interface is slowly getting better, but it’s still atrocious. Managing images and file backups for their VM hosting EC2 stuff is a huge pain as well.

10 bucks/month per TB isn’t cheap compared to 5 bucks for unlimited w/ Crashplan, Backblaze, et al.

I don’t think you can use those for enterprise level storage though. If I could back up work’s 15TB of data for $75/month, I’d have to seriously consider that.

Huh. We are #5 on Google right now for information about Amazon Glacier GUI. Too bad there is no GUI.

Enterprise | CrashPlan | Endpoint and Cloud Backup Solutions. I’m pretty sure it costs more than that, but I don’t know pricing.

Full disclosure - I work for Code 42, the company that makes CrashPlan. I don't know much about the enterprise side of the business, and at least some of it is Code 42 selling racks of servers as a CrashPlan appliance. There are some big companies with a lot of data that use CrashPlan for backups, and I was surprised to find out how much data we manage directly.

Glacier strikes me as a cool archive option, but less useful for backups, since it looks like Glacier gets more expensive if you actually do restores. With 15 TB stored, if you retrieve more than 750 GB a month, then you get hit. So, $150/month for storage, plus a potentially significant transaction cost, plus the fee per upload. Still, if you have to store large archived data sets instead of doing regular backups/restores, it's potentially nice.
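Rough numbers behind that, assuming the commonly cited 5% monthly free-retrieval allowance and the advertised $0.01/GB-month storage rate:

```python
stored_gb = 15_000                           # 15 TB
storage_per_month = stored_gb * 0.01         # $150/month at $0.01/GB-month
free_retrieval_gb = stored_gb * 0.05         # 5% allowance -> 750 GB/month free
print(storage_per_month, free_retrieval_gb)  # 150.0 750.0
```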

Oh, I agree, and that's probably who it's aimed at. If you want/need to customize your backups with the flexibility the APIs offer, it seems to be a good option.

It's just not going to compete for individual or SMB customers as currently priced, and often that market is a Trojan horse into enterprise accounts. Akin to iPhones/iPads getting rolled out after the CEO gets one. :)

Oddly enough, I was at a conference listening to a presentation from Amazon AWS when the speaker was interrupted to announce that this service had gone live literally minutes beforehand.

Definitely enterprise-focused - which just happened to be the audience in the room at the time as well.

The main thing here is not the cost, it’s the API. If you’re using a content management system or DAM or similar, you need that thing to be able to automatically retrieve files from this service. If it can’t, you’re out doing it manually and that costs time and labor. We’ll use the crap out of this thing, I expect, but we have a few hundred terabytes of digital assets.
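The automated-retrieval piece looks roughly like this with boto3 - a sketch only; the vault name and archive ID are placeholders, and the multi-hour wait is baked into the job model:

```python
import time
import boto3

glacier = boto3.client("glacier", region_name="us-east-1")

# Retrieval is asynchronous: you request an archive, then wait hours
# for the job to complete before you can download anything.
job = glacier.initiate_job(
    vaultName="dam-assets",                 # placeholder vault
    jobParameters={
        "Type": "archive-retrieval",
        "ArchiveId": "EXAMPLE-ARCHIVE-ID",  # placeholder ID
    },
)

while True:
    status = glacier.describe_job(vaultName="dam-assets", jobId=job["jobId"])
    if status["Completed"]:
        break
    time.sleep(900)  # poll every 15 minutes; expect a several-hour wait

out = glacier.get_job_output(vaultName="dam-assets", jobId=job["jobId"])
with open("retrieved-asset.bin", "wb") as f:
    f.write(out["body"].read())
```

A CMS or DAM would wrap exactly that job-and-poll cycle; without that integration, someone is doing it by hand.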

We do most of this stuff via Chef and only bake in chef-client and some very limited other stuff, so our image creation process is A) really easy and B) fully scripted. I'm also using some PHP code I stole from the internets to manage backups and snapshots - I'd be happy to share if you'll shoot me a PM.
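The gist of that kind of snapshot script, sketched here in boto3 rather than the PHP mentioned above; the instance ID and description are placeholders:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Find every EBS volume attached to a given instance (placeholder ID)
# and snapshot each one; a cron job would handle scheduling and rotation.
volumes = ec2.describe_volumes(
    Filters=[{"Name": "attachment.instance-id",
              "Values": ["i-0123456789abcdef0"]}]
)
for vol in volumes["Volumes"]:
    snap = ec2.create_snapshot(
        VolumeId=vol["VolumeId"],
        Description="nightly backup",
    )
    print("created", snap["SnapshotId"])
```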

Thanks for the offer, but our usage of their EC2 stuff is still pretty minimal, so it’s manageable even with the awkward interface. Also, I personally stopped working with them a couple months ago, so I don’t care so much about it anymore :)

The Hacker News commenters are calculating that retrieval costs will be outrageous.

QT3 is weirdly high-ranking in search results. I frequently make topics, search on them, and find the QT3 post I started in the top result or on the first page 30 minutes later.

I lol’d at “Amazon Glacier is optimized for data that is infrequently accessed and for which retrieval times of several hours are suitable.”

It’s well named, at least.

We back up maybe 50TB a month - that's 50,000GB. And at 50TB the pricing is nowhere near $.01/GB ($.12/GB for the first 10TB, $.09/GB for the next 40TB, or roughly 10X the advertised $.01/GB), so my math came out to $4800/month. Plus the whole "wait several hours" part.

Even archiving our monthlies would not be economical.

This service is not for my business. Tape is still too cost-effective.

Those are the prices for retrieving the data - the cost to store it is a flat $0.01/GB per month (in the US-East region; it's a bit more in certain other regions). Transferring the data in costs nothing.
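Putting both sets of numbers side by side - the tiered rates quoted above versus the $0.01/GB-month storage rate:

```python
stored_gb = 50_000                                  # 50 TB

# The tiered figures quoted above ($0.12/GB first 10 TB, $0.09/GB next 40 TB):
tiered = min(stored_gb, 10_000) * 0.12 + max(stored_gb - 10_000, 0) * 0.09
print(tiered)            # 4800.0 -> the $4800/month figure

# Plain Glacier storage at $0.01/GB-month (US-East):
print(stored_gb * 0.01)  # 500.0
```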