Blogs

Tier-1 Performance Workloads Can Enjoy the Benefits of Object Storage With Hitachi’s HCP

By Hubert Yoshida posted 06-22-2021 19:04

  
Object storage began as an economical way to archive large amounts of unstructured data. Today, vast amounts of data are archived on object storage, where they also gain the benefits of scalability, protection, compliance, and cost efficiency. Until now, enterprises have been reluctant to use object storage for tier-1 applications because of the metadata management overhead associated with object storage. Hitachi Vantara now makes it possible for organizations to realize these same benefits, and more, with their tier-1 workloads such as AI/ML, analytics, data warehouses, and S3 cloud applications while maintaining the performance levels they require. Tier-1 workloads are also seeing an increase in threats such as ransomware and growing compliance requirements like GDPR (General Data Protection Regulation) and CCPA (California Consumer Privacy Act), which can be addressed more easily with the immutability and data protection of object storage systems like HCP.

While object storage is associated with petabyte-scale data volumes and economical long-term data retention, its metadata structure was thought to limit performance compared with the file and block storage systems of the tier-1 enterprise mainstream. Recent technology enhancements such as flash and NVMe, multi-core processors, and virtualization have closed that performance gap. Software virtualization of the OS (VM instances), applications (containers), and storage resources (software-defined storage) interposes an abstraction layer between hardware implementations and applications. By decoupling the storage data and control planes, virtualization enables distributed, scale-out clusters of any size and capacity to increase overall performance.

Hitachi Vantara’s object storage solution also takes a unique approach to enhancing the search capability of tier-1 workloads. Hitachi Content Intelligence automates the extraction, classification, enrichment, and categorization of data residing on both Hitachi Vantara and third-party repositories, located on-premises and in clouds, and across heterogeneous internal and external data repositories. This approach drastically reduces time spent searching for what is needed or recreating what already exists. Additionally, Content Intelligence delivers:
 
  • Guided data exploration based on automated classification and categorization.
  • Immediate visibility to all your data by unifying data access across disparate locations and data types.
  • Important insights by transforming your data into valuable business information.
  • Managed access to sensitive data with granular access controls and security system integrations.
  • The right data to the right person at the right time, using personalized and self-service features and user experiences.


Last year (October 2020), ESG published a Technical Review and Lab Validation of Hitachi Content Platform: High-performance Object Storage for Tier-1 Workloads. In this technical lab validation, ESG reviewed performance tests and results based on real-world customer configurations and concluded that HCP object storage delivers the high performance and scalability that enterprises demand of their business-driven workloads. These performance results change the role of object storage: the advantages of massive scalability, fast data retrieval, and cost efficiency can now be used with tier-1 production workloads. The results showed the following (a simple way to measure metrics like these yourself is sketched after the list):
 
  • Average large object GET and PUT performance of 14 GB/sec to 40 GB/sec throughput.
  • Average small object GET and PUT performance of 44K to 141K operations/second.
  • Time-to-first-byte performance of 15 ms or less.
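
To put figures like these in context, the following is a minimal sketch of how PUT/GET throughput and time to first byte can be measured against any S3-compatible endpoint, such as an HCP namespace, using Python and boto3. The endpoint URL, bucket name, credentials, and object size are illustrative placeholders, not values from the ESG tests.

    # Minimal sketch: timing PUT/GET throughput and GET time-to-first-byte
    # against an S3-compatible endpoint. All connection details are placeholders.
    import time
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="https://tenant.hcp.example.com",   # assumed S3-compatible HCP endpoint
        aws_access_key_id="ACCESS_KEY",
        aws_secret_access_key="SECRET_KEY",
    )

    BUCKET = "perf-test"                     # assumed existing bucket/namespace
    KEY = "sample-object"
    payload = b"x" * (64 * 1024 * 1024)      # 64 MiB test object

    # PUT throughput
    start = time.perf_counter()
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=payload)
    put_secs = time.perf_counter() - start
    print(f"PUT: {len(payload) / put_secs / 1e6:.1f} MB/s")

    # GET time-to-first-byte and throughput
    start = time.perf_counter()
    resp = s3.get_object(Bucket=BUCKET, Key=KEY)
    first = resp["Body"].read(1)             # the first byte arrives here
    ttfb = time.perf_counter() - start
    rest = resp["Body"].read()               # drain the remainder of the object
    get_secs = time.perf_counter() - start
    print(f"GET: time to first byte {ttfb * 1000:.1f} ms, "
          f"{(len(first) + len(rest)) / get_secs / 1e6:.1f} MB/s")

Results such as ESG’s come from many clients, object sizes, and nodes working in parallel; this single-threaded loop only illustrates where the PUT/GET and time-to-first-byte timings come from.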

ESG also reviewed several customer examples that demonstrate the real-world performance benefits of HCP object storage.
 
  • 1 TB/minute at exabyte scale. A government customer used 54 HCP nodes to deliver 1 TB/minute of throughput for a 22 PB Hadoop data lake using the S3A protocol (a minimal S3A access sketch follows these examples). This customer was collecting huge amounts of streaming data, including voice, data logs, machine logs, and security event logs, for analysis. The HCP solution delivered faster, more accurate data insights across multiple data sources with cost efficiency. The customer expects to grow to more than 80 PB within two years, with all data retained for a year.
 
  • 1 trillion objects, 12 GB/second. Using an HCP all-flash configuration, a customer in the financial services industry with heavy growth ingested one trillion objects over 12 petabytes of storage, across 56 different applications, and maintained 12 GB/sec small object (e.g., emails, PDFs, metadata) performance while ensuring full regulatory compliance. Metadata, indexing, and search functions supported all business use cases interfacing with the data, including legal hold, compliance, dispositioning, and high-performance search. Performance was critical since this archive is part of the company’s primary customer interaction workflow. The customer extracted the data and metadata, offloaded, and retired a mainframe environment, gained massive growth with performance at scale, and deployed new use cases. Furthermore, by consolidating multiple data sources onto HCP, they were able to save $100M in administrative and maintenance costs over five years.
 
  • 15 ms response rate. Another government customer needed to host client data review processes and wanted to provide sustained time-to-first-byte performance of 50 ms or less so that employees could work productively. The HCP all-flash solution exceeded this objective, delivering a sub-15 ms time to first byte for the customer’s one PB of data, enabling higher productivity and delivering the needed customer experience.
 
  • High-performance consolidation. Another financial services organization wanted to consolidate regulated unstructured data from content management and data repositories, taking data from high-speed Fibre Channel storage and NAS platforms. Using multiple all-flash HCP clusters, this company consolidated multiple service tiers into a single, high-performance tier.
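
In the Hadoop data lake example above, the S3A connector is what lets analytics engines read objects directly from an S3-compatible store. The sketch below shows a hypothetical PySpark session pointed at such an endpoint; the endpoint URL, credentials, bucket, prefix, and column name are illustrative placeholders, and the hadoop-aws connector must be available on the cluster.

    # Minimal sketch: a Spark job reading event logs over S3A from an
    # S3-compatible object store. Connection details are placeholders.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("s3a-data-lake-sketch")
        .config("spark.hadoop.fs.s3a.endpoint", "https://tenant.hcp.example.com")  # assumed endpoint
        .config("spark.hadoop.fs.s3a.access.key", "ACCESS_KEY")
        .config("spark.hadoop.fs.s3a.secret.key", "SECRET_KEY")
        .config("spark.hadoop.fs.s3a.path.style.access", "true")  # path-style addressing for non-AWS stores
        .getOrCreate()
    )

    # Read security event logs straight off the object store and run a simple aggregation.
    logs = spark.read.json("s3a://datalake/security-event-logs/")   # assumed bucket and prefix
    logs.groupBy("event_type").count().show()                       # "event_type" is an assumed field

Setting fs.s3a.path.style.access to true is commonly needed for S3-compatible stores that address buckets by path rather than by virtual-hosted style; the exact settings for an HCP deployment would follow Hitachi’s own documentation.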

Organizations need more data, faster, to deliver insights that drive business decisions. They also need greater protection against threats like ransomware, greater transparency for compliance, and greater visibility into all their data, which can only come from the rich metadata capabilities of an object storage system like Hitachi’s HCP. According to ESG research, 53% of respondent organizations expect to accelerate spending on on-premises object storage. Among those accelerating their investments, 32% identified AI/ML workloads as drivers of storage spending growth, while 38% identified IoT and 29% identified data warehouses.


#Hu'sPlace
#Blog