Data volumes are growing at an unprecedented rate. Cloud storage, big data analytics, Industry 4.0 programmes and the adoption of IoT devices are combining to create a huge surge in the level of unstructured data that enterprises must store, manage and exploit for commercial advantage. One estimate from the International Data Corporation’s Digital Universe study suggests that by 2020, worldwide data volumes will have increased tenfold.
How will your company cope with the challenge?
Consider, too, that regulations such as GDPR impose strict rules on data protection and require you to retain customer and client information for set periods. And if you’re operating in a sector like pharma or finance, there can be a raft of further data governance requirements just to keep the lawyers happy.
On first view, this can all seem like an intimidating mix. There are headaches for your CIO and IT team as they play catch-up with the explosion of data, along with spiralling storage costs to upset your finance department. So, what’s the answer?
Conventional storage: don’t fall for the false trail
The natural response of many businesses is simply more, more, more: deploy new SAN systems, add more NAS boxes, and implement more file servers. In truth, this simply equates to running twice as fast in order to stand still.
What has become abundantly clear in recent years is that conventional, hierarchical storage architectures cannot scale efficiently enough to keep up with demand. Adding more storage boxes simply leads to an unsustainable level of sprawl and complexity within your data centre, making your infrastructure harder and harder to manage and govern effectively. Running backups, for example, becomes especially time-consuming and tricky.
The end result? A mass of separate data siloes, limiting the ability of your business users to search through and access files, preventing them from effective collaboration, and failing to uncover the insights that could be crucial to the success or failure of your next commercial venture.
Take the smart approach: Intelligent object storage
To equip your enterprise for the challenges of big data, you need an alternative approach. One that delivers scalability, makes management simple, allows you to set clear governance policies, and helps users to discover hidden patterns of customer demand, for example.
Object storage provides the answer. Rather than relying on a hierarchical architecture, object storage employs a flat pool structure, eliminating the problem of data siloes. Stored files exist alongside customised metadata as an object, each with a unique identifier. Applications can pinpoint a file’s location whenever users need it, instead of having to search through an extensive file directory.
And within the extended metadata that makes up the stored object, you can specify data retention and deletion policies, helping to ensure you meet your compliance and data protection goals.
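The flat-pool model described above can be sketched in a few lines of Python. This is an illustration only, using hypothetical names rather than HCP’s actual interface: each object pairs file content with custom metadata under a unique identifier, and the retention policy is enforced from the object’s own metadata.

```python
import time
import uuid

# Hypothetical sketch of a flat object pool (not HCP's actual API).
class FlatObjectStore:
    def __init__(self):
        self.objects = {}  # unique identifier -> {"data": ..., "metadata": ...}

    def put(self, data, metadata):
        object_id = str(uuid.uuid4())  # every object gets a unique identifier
        self.objects[object_id] = {"data": data, "metadata": metadata}
        return object_id

    def get(self, object_id):
        # Direct lookup by identifier: no directory tree to traverse.
        return self.objects[object_id]

    def delete(self, object_id, now=None):
        # The retention policy lives in the object's own metadata.
        now = now if now is not None else time.time()
        if self.objects[object_id]["metadata"].get("retain_until", 0) > now:
            raise PermissionError("object is under retention")
        del self.objects[object_id]

store = FlatObjectStore()
oid = store.put(b"invoice scan",
                {"customer": "ACME", "retain_until": time.time() + 86400})
print(store.get(oid)["metadata"]["customer"])  # -> ACME
```

Because retention travels with the object rather than sitting in a separate application, a compliant deletion policy applies wherever the object is stored.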
Introducing Hitachi Content Platform
The Hitachi Content Platform (HCP) is a scalable cloud storage platform with built-in data protection, data security and data governance. The platform comes with features such as self-protecting storage, encryption, audit trails, immutable storage, legal hold and much more besides.
One of the most successful enterprise object storage platforms on the market, HCP has more than 2,000 customers, an ecosystem of nearly 500 focus partners, more than 100 ISV partners and nearly 200 certified partner applications.
The platform enables you to store and protect the massive data volumes created by multiple enterprise applications. You can then sync and share files to provide remote users with secure access to data via their mobile device of choice while working on the move.
HCP is fully cloud compatible, too, so you can provision a private cloud for your enterprise and begin storing data from your existing file servers, ECM solutions and other core systems. The platform supports all standard public cloud protocols, offering you the flexibility to move encrypted data from your internal systems to Microsoft Azure, Google Cloud Platform or Amazon Web Services.
Navigating the big data ocean
The extended metadata associated with each file offers much greater visibility of where your data assets reside. The speed, reliability and power of object storage makes it much easier for your business users to search for and retrieve the information they need, helping them to plan smarter analytics jobs.
Alongside HCP, you can also deploy the add-on Hitachi Content Intelligence (HCI) to augment data quality for efficient data analytics and insights. Users can enter a specific request to HCI; for example, for customer purchase data to help you spot trends in demand. HCI extracts and categorises the data from HCP, and presents a set of recommendations based on the original request. This places the right data in your users’ hands to help them make smarter commercial decisions.
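Conceptually, a metadata-driven request of this kind boils down to filtering the object pool on metadata fields. The sketch below is a simplified illustration with made-up field names, not HCI’s real query interface:

```python
# Hypothetical sketch of metadata-driven search (not HCI's real interface).
objects = [
    {"id": "a1", "metadata": {"type": "purchase", "region": "EMEA", "amount": 120}},
    {"id": "b2", "metadata": {"type": "invoice",  "region": "EMEA", "amount": 300}},
    {"id": "c3", "metadata": {"type": "purchase", "region": "APAC", "amount": 80}},
]

def search(objects, **criteria):
    """Return objects whose metadata matches every requested field."""
    return [o for o in objects
            if all(o["metadata"].get(k) == v for k, v in criteria.items())]

# "Customer purchase data for EMEA" as a metadata query:
purchases = search(objects, type="purchase", region="EMEA")
print([o["id"] for o in purchases])  # -> ['a1']
```

The point is that the query runs against metadata alone: no application needs to open each file to decide whether it is relevant.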
What’s more, enhanced visibility across your storage assets cuts through the problem of dark data that often comes with using a mish-mash of siloes in your data centre. And automated data tiering within HCP organises content according to its business value or SLA, again contributing to a more cost-effective, policy-driven storage strategy.
Data management, the easy way
HCP also completely removes the need for you to run backups. Normally, companies turn to tape storage or rely on costly backup management tools to cope with the pressure of ensuring continuity as data volumes rise. With almost all other object storage solutions, you still have to run backups for unstructured data, as every time you update an object, you lose the original if there is no copy.
To sidestep this problem and ease the workload of your IT team, HCP stores both the new version of the updated object as well as the original, in two separate locations. That way, you always have access to a point-in-time copy of your data, and you no longer face the headache of having to find an efficient way to back up masses of new files.
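As an illustration of the idea (hypothetical code, not HCP’s implementation), a store that retains every version of an object always has a point-in-time copy to fall back on, so no separate backup run is needed:

```python
# Hypothetical sketch: keep every version of an object so a point-in-time
# copy always exists (not HCP's implementation).
class VersionedStore:
    def __init__(self):
        self.versions = {}  # key -> list of (version_number, data)

    def put(self, key, data):
        history = self.versions.setdefault(key, [])
        history.append((len(history) + 1, data))  # originals are never overwritten

    def get(self, key, version=None):
        history = self.versions[key]
        if version is None:
            return history[-1][1]       # latest version
        return history[version - 1][1]  # point-in-time copy

vstore = VersionedStore()
vstore.put("report.docx", b"draft")
vstore.put("report.docx", b"final")   # an update retains the old version
print(vstore.get("report.docx"))             # latest -> b'final'
print(vstore.get("report.docx", version=1))  # original -> b'draft'
```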
Putting your data safely to bed
When you come to store data long-term, HCP provides a smarter, more flexible approach than a conventional content archive. For a traditional archive, you need an index application, database resources and a file server, and you can read and understand data only via its original application. That can make retrieval a time-consuming and tedious process.
HCP – thanks to extended metadata and the use of cloud protocols such as S3 and Swift – turns this approach on its head. Instead, HCP enables you to understand archived data in an instant, without having to worry about deploying the application used to create the file. As compliance demands increase, that ability to drill down and understand your archived data immediately could prove particularly valuable.
As applications reach their end-of-life, you can use HCP as the target platform to archive everything from mail servers and document stores to core SAP systems. Data, metadata and policy rights sit together in a secure, consolidated archive within HCP, with users able to view information via an intuitive portal.
Big data: are you ready?
Data growth is an inescapable fact, and your future storage strategy will be an important factor in your ongoing success as a business. Working with Hitachi can help to ensure you turn the flood of big data into an ocean of opportunity, keeping your storage costs under control and outsmarting your competition.
For more details of how Hitachi Content Platform can help your business prosper as data volumes grow, please visit: https://www.hitachivantara.com/en-us/products/cloud-object-platform/content-platform.html