Happy holidays everyone! Like many of you, my family recently spent time bringing down our Christmas stuff from the attic. The ornaments, lights, holiday cards and other decorations had been in storage since early January. And our ugly sweaters had to be pulled out from deep within our closets. This got me thinking about how we manage our data, particularly the data that isn't accessed regularly but, like Christmas decorations, we still need to keep around.
The fact is that data which may be critical to have at our fingertips now won't always be so important as it ages. In our conversations with customers, they tell us that more than 60% of unstructured data, such as documents, PowerPoint presentations, and images, is considered cold if not accessed within 30 days. But these cold files could be needed again, so they can't simply be deleted. Yet they continue to consume the same compute, memory and storage resources as long as they remain in tier 1 storage. These files are also copied every time a full system backup is run.
Hitachi has solved this problem with our Data Migrator to Cloud software, which enables the automatic tiering of less frequently used files from a Virtual Storage Platform (VSP) to Amazon Web Services, Microsoft Azure, Hitachi Content Platform, or to our Hitachi Managed Cloud Service. Data Migrator to Cloud gives the administrator granular control over what data to tier from their VSP to a public or private cloud, and when. And after the data has been moved, the application will still be able to locate it at the same logical address, since pointers to that data remain stored within the VSP.
All the administrator needs to do is define a migration policy that looks at file type, size, and age. Then the "auto-magic" tiering kicks in. For example, the administrator can define a policy such as "any ZIP file larger than 5MB that has not been accessed in 30 days" will be automatically migrated/tiered. Policies can be set separately for different file types to best support the business objectives.
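To make the idea concrete, here is a minimal sketch of the kind of rule evaluation a policy like that implies. This is purely illustrative; the class and method names (`MigrationPolicy`, `matches`) are my own and are not the actual Data Migrator to Cloud interface, which is configured through the product's management tools rather than code.

```python
import time

THIRTY_DAYS = 30 * 24 * 60 * 60  # seconds

class MigrationPolicy:
    """One hypothetical rule: migrate files of a given type above a
    size threshold that have not been accessed recently."""

    def __init__(self, extension, min_size_bytes, max_idle_seconds):
        self.extension = extension.lower()
        self.min_size_bytes = min_size_bytes
        self.max_idle_seconds = max_idle_seconds

    def matches(self, path, size_bytes, last_access_epoch, now=None):
        """Return True if the file qualifies for tiering to the cloud."""
        now = time.time() if now is None else now
        return (path.lower().endswith(self.extension)
                and size_bytes > self.min_size_bytes
                and (now - last_access_epoch) > self.max_idle_seconds)

# "Any ZIP file larger than 5MB, not accessed in 30 days"
zip_policy = MigrationPolicy(".zip", 5 * 1024 * 1024, THIRTY_DAYS)
```

A real deployment would hold one such rule per file type, checking each candidate file's metadata against every rule during a scheduled scan.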
A recent study by IDC shows that cloud accounted for 49% of enterprise storage spending in 2016 and is projected to reach 54% by 2020. Much of that cloud spending will be used to store archived data. This is particularly economical as organizations transition to all-flash datacenters: the trend is to tier infrequently accessed data to the cloud so it doesn't consume the high-priced real estate of an all-flash data center.
If you want to learn more about this topic, check out a recent webcast that I did with Eric Burgener, IDC Research Director.
So while the need to access older data may not be as predictable as bringing Christmas decorations in and out of storage, we still can make use of the cloud as an unlimited attic.
See you again in 2017!