New service promises low-cost long-term data storage in bid to lure customers away from tape
Amazon Web Services is going after the tape storage market with its new S3 Glacier Deep Archive service, which promises data storage inexpensive enough to make cloud-based archiving more attractive to customers than legacy tape systems.
The Lowdown: S3 Glacier Deep Archive enables businesses to use AWS for long-term data storage without the investment in or complexity of on-premises tape-based systems. In launching the service, AWS is going after the tape-based storage segment, which still has a substantial share of the archival market.
The Details: The new service promises cold storage for just $1 per terabyte of data per month. AWS says the cost is substantially lower than the traditional tape and off-site storage options many businesses use today for long-term storage and regulatory compliance. Moreover, AWS says S3 Glacier Deep Archive enables users to recover deep storage data within 12 hours — a substantial time savings over retrieving tapes from off-site storage.
AWS is partnering with several storage software vendors, including Commvault and Veritas, to provide the tools for using and managing S3 Glacier Deep Archive.
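To make the pricing and usage model concrete, here is a minimal Python sketch. It estimates monthly cost at the article's advertised $1-per-terabyte rate and shows the parameter shape an SDK upload call (such as boto3's `put_object`, which accepts the `DEEP_ARCHIVE` storage class) would take; the bucket and key names are hypothetical, and no AWS call is actually made.

```python
# Advertised rate from the article: $1 per TB per month.
ADVERTISED_PRICE_PER_TB_MONTH = 1.00  # USD

def monthly_cost_usd(terabytes: float) -> float:
    """Estimated monthly Deep Archive cost at the advertised rate."""
    return terabytes * ADVERTISED_PRICE_PER_TB_MONTH

def deep_archive_put_params(bucket: str, key: str) -> dict:
    """Parameters an S3 upload call would take to land an object
    directly in the Deep Archive tier. "DEEP_ARCHIVE" is the
    storage-class identifier AWS uses for this service; the bucket
    and key here are illustrative only."""
    return {
        "Bucket": bucket,
        "Key": key,
        "StorageClass": "DEEP_ARCHIVE",
    }

# A 500 TB archive would run about $500/month at the advertised rate.
print(monthly_cost_usd(500))
print(deep_archive_put_params("my-archive-bucket", "backups/tape-042.tar"))
```

In practice the dict above would be passed as keyword arguments to an authenticated S3 client; the point of the sketch is simply that moving data into the new tier is a one-parameter change (`StorageClass`) on a standard upload.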
The Impact: The cold storage market opportunity is substantial. According to Enterprise Storage Forum, 68 percent of businesses worldwide employ some form of cold storage technology for their long-term data archiving needs. Tape-based storage systems are used by 23 percent of businesses worldwide.
Background: AWS already offers a cold storage service, S3 Glacier. The differences between the old and new services are cost, retrieval speed, and data durability rating. S3 Glacier Deep Archive competes against similar archival services offered by Microsoft and Google.
The Buzz: “As the demand for higher quality and increased amounts of content continues to rapidly grow, we will now have the ability to eliminate the limitations of a hybrid on-prem tape model by using S3 Glacier Deep Archive to reduce access time and rapidly shift the availability and workability of content sources exclusively on the cloud,” said Andy Shenkler, Chief Product Officer at Deluxe, a video creation-to-distribution company with headquarters in Los Angeles and New York. “AWS’s S3 Glacier Deep Archive addresses the challenges that have previously existed around the economics and time lines associated with accessing and utilizing large media assets throughout every step of the content creation and distribution process.”
“Our customers need to be able to move, manage, and use data in a way that promotes business agility and contains costs,” said Karen Falcone, Vice President of Worldwide Cloud and Service Providers at Commvault. “With Commvault’s support for AWS, customers get a single, comprehensive data management platform with full data protection, backup, recovery, management, and e-discovery capabilities—all tightly integrated with AWS services. And now, S3 Glacier Deep Archive will allow us to provide the lowest-cost storage available in the cloud and have it accessible, if necessary, in the future. For our customers in regulated industries, that can mean petabytes of data going back years. Customers can use S3 Glacier Deep Archive today as an Early Release feature.”
“Our customers need to be able to harness the power of their information with solutions designed to serve the world’s most complex and largest heterogeneous environments while accelerating digital transformation, reducing risk, and delivering cost savings,” said Cameron Bahar, Senior Vice President and CTO at Veritas. “With Veritas solutions supporting AWS, we continue to extend our support for cloud usage models and provide our customers simple and agile solutions to solve complex data management issues for backup/recovery, archiving, primary storage, and disaster recovery use cases. With Amazon S3 Glacier Deep Archive, Veritas will be able to help customers increase their savings even more significantly. Veritas customers can use S3 Glacier Deep Archive Standard tier with the latest NetBackup version as of today.”