Hitachi Content Platform


New Partnership and Updates Unlock Data Value

By Timir Desai posted 07-14-2020 16:20

  

Every day we engage with organizations searching for ways to get more value from their data, and we're consistently presented with the same five object storage requirements:

 

·         Variety: the breadth of data types continues to grow, making an adaptive storage solution a necessity.

·         Value: massively scalable solutions must support hybrid architectures to deliver the greatest cost-of-ownership advantage by balancing capex and opex.

·         Velocity: data must be accessible quickly to support modern, Tier 1, low-latency workloads.

·         Veracity: long-term data integrity and durability are non-negotiable.

·         Vital: modern object storage use cases include mission-critical application support, which drives the requirement for high availability.

 

All of these requirements are important, and many are not new, but Velocity is the game changer. That's because it's not just about how fast data is produced, changed, or discarded, but about the speed at which data must be accessed, understood, and processed for businesses to make quick, prudent, data-driven decisions.
 

Traditional object storage offerings are typically hailed as ideal solutions for efficiently handling vast quantities of unstructured data. But the world is changing. The latest object stores must keep pace with the performance delivered by distributed block storage solutions and support modern workloads that have voracious appetites for scale and performance, create trillions of files, and are metadata intensive. Organizations are now rethinking the role of object storage and exploring it for Tier 1 workloads and intelligent data services. To leverage their unstructured data for insights, IP, and revenue-generating activities, they are also looking at distributed file system (DFS) solutions to handle scale and performance for high-performance computing, real-time analytics, and AI workloads.
 

 Today’s Announcement
 

The newly announced partnership between WekaIO and Hitachi Content Platform (HCP) will result in an integrated, best-of-breed solution for accelerating unstructured data management. But that's not all! For unstructured data that changes less frequently but still demands a high-performance storage target, we're delivering exponential increases in both performance and scale in HCP itself. With these latest enhancements, HCP should be a strong consideration for newer cloud workloads, machine data management, and other use cases where object storage was previously assumed to be too slow.

 




HCP performance-optimized configurations
New G11 All-flash nodes

·         3.4x faster S3 throughput than the previous generation, at as much as 34% lower cost.

New HCP version 9.x software

·         A new capacity balancing feature automatically balances storage capacity across HCP S Series nodes in an HCP pool, delivering optimal parallel throughput performance.

·         Leverage existing capacity on the latest Hitachi VSP storage through support for the VSP 5000 series and VSP E990.

New software for S Series storage nodes

·         Enables a >3x increase in small-object read and write performance; small objects can be read at up to 40,000 objects per second (see the measurement sketch after this list).

·         Enables a >2x increase in large-object read and write performance; an HCP S31 node can achieve up to 8,600 MiB per second when writing large (100 MiB) objects.

·         Enables nearly 3x the capacity of the previous generation in the same rack space and scales to more than 15 PB of disk capacity in a single rack, allowing more than an exabyte of data on-premises.


New OEM partnership with WekaIO

·         Enhances our portfolio with a NAS solution that offers a high-performance, NVMe-native parallel file system. It will be coupled with HCP and introduced as a jointly engineered, Hitachi Vantara-branded solution.
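
Because HCP exposes an S3-compatible API, throughput claims like the ones above can be sanity-checked from any standard S3 client. The following is a minimal, single-threaded sketch in Python using boto3; the endpoint URL, credentials, bucket name, object count, and object size are all hypothetical placeholders, and published figures such as 40,000 small-object reads per second assume many parallel clients and tuned configurations rather than a simple loop like this.

import time
import boto3

# All values below are placeholders for illustration only.
ENDPOINT = "https://namespace.tenant.hcp.example.com"  # hypothetical HCP S3-compatible endpoint
BUCKET = "perf-test"                                   # hypothetical bucket/namespace
OBJECT_COUNT = 1000
OBJECT_SIZE = 4 * 1024                                 # 4 KiB "small" objects

s3 = boto3.client(
    "s3",
    endpoint_url=ENDPOINT,
    aws_access_key_id="ACCESS_KEY",       # placeholder credentials
    aws_secret_access_key="SECRET_KEY",
)

payload = b"x" * OBJECT_SIZE

# Time small-object writes.
start = time.monotonic()
for i in range(OBJECT_COUNT):
    s3.put_object(Bucket=BUCKET, Key=f"small/{i}", Body=payload)
write_s = time.monotonic() - start

# Time small-object reads.
start = time.monotonic()
for i in range(OBJECT_COUNT):
    s3.get_object(Bucket=BUCKET, Key=f"small/{i}")["Body"].read()
read_s = time.monotonic() - start

print(f"writes: {OBJECT_COUNT / write_s:.0f} objects/s, "
      f"reads: {OBJECT_COUNT / read_s:.0f} objects/s")

In practice, a measurement like this would be run from many concurrent clients or threads against multiple nodes to approach the aggregate numbers quoted above.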

 


 


A next-generation object store isn't developed from a single design concept or criterion. Rather, it's the outcome of continuously evolving HCP, integrating and combining portfolio products for performance, gateways, collaboration, and search and analytics, and expanding the partner ecosystem. This has allowed HCP to grow from its early focus on archiving and compliance to supporting a wide range of use cases, including cloud storage, Hadoop and Splunk optimization, backup to object, and next-gen file services. By addressing multiple challenges with a single solution, HCP has amassed thousands of customers. Come see what the excitement is about! Read the items below to learn what distinguishes Hitachi Content Platform from the rest.
 
#Blog
#HitachiContentPlatformHCP
#ThoughtLeadership