Champions Corner

Veeam 12's introduction of direct-to-object storage enhances backup performance, simplifies management, and improves data protection against ransomware threats when using Hitachi object storage as a primary backup target. The direct-to-object storage feature brings several benefits. It enables faster backups and restores, which streamlines disaster recovery efforts and reduces recovery point objectives (RPOs) and recovery time objectives (RTOs). By eliminating the need for tiering, the backup process becomes more efficient and faster. Additionally, this approach optimizes storage by allowing massive scalability ...
On-premises S3-compatible Storage for Snowflake with Hitachi Content Platform   With their June 2023 update, the folks at Snowflake announced general availability of S3-compatible storage support for Hitachi Content Platform and HCP for Cloud Scale (collectively HCP). With this announcement, customers can directly integrate their HCP data sets, either on-premises or in colocated private datacenters, with their cloud-based Snowflake compute to enable hybrid cloud workflows. Snowflake's data platform has revolutionized data warehousing as a service with an impressive suite of capabilities, resulting in rapid adoption and tremendous growth. Snowflake's ...
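Before pointing Snowflake at HCP, it can help to confirm that the target bucket is reachable over HCP's S3-compatible endpoint. Below is a minimal sketch using the AWS SDK for Java v1; the endpoint, bucket name, and credentials are hypothetical placeholders, not values from the announcement.

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class HcpS3Check {
    public static void main(String[] args) {
        // Hypothetical HCP namespace endpoint and credentials -- substitute your own.
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                // The region value is required by the SDK but not interpreted by HCP.
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                        "https://ns1.tenant1.hcp.example.com", "us-east-1"))
                .withPathStyleAccessEnabled(true)
                .withCredentials(new AWSStaticCredentialsProvider(
                        new BasicAWSCredentials("ACCESS_KEY", "SECRET_KEY")))
                .build();

        // List a few keys to confirm the bucket Snowflake will stage from is reachable.
        s3.listObjectsV2("snowflake-stage").getObjectSummaries()
                .forEach(o -> System.out.println(o.getKey()));
    }
}
```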
Nasuni integrates with HCP via the S3 API; here are quick steps to get HCP set up to integrate with Nasuni. For instructions to set up Nasuni for HCP, please refer to the document How do I configure Nasuni for HCP? System Administrator Configuration: Log into the System Management Console (SMC). On the SMC => Tenants page, create a new tenant for Nasuni. Give the tenant enough Hard Quota and Namespace Quota for all the Nasuni filers you expect; Nasuni will create one namespace per volume. On the SMC => Configuration => Protocol Optimization page, set the default to "Default new namespaces to optimize for cloud protocols only." Click Update ...
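As a quick smoke test of the new tenant, you can mimic what a Nasuni filer will do on first use: create a namespace over the S3 API. Here is a minimal sketch with the AWS SDK for Java v1, assuming the tenant's data access account is allowed to create namespaces; the endpoint, bucket name, and credentials are hypothetical.

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class HcpTenantSmokeTest {
    public static void main(String[] args) {
        // Hypothetical tenant endpoint and data-access credentials -- substitute your own.
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                        "https://tenant1.hcp.example.com", "us-east-1"))
                .withPathStyleAccessEnabled(true)
                .withCredentials(new AWSStaticCredentialsProvider(
                        new BasicAWSCredentials("ACCESS_KEY", "SECRET_KEY")))
                .build();

        // Nasuni creates one namespace per volume; on HCP, creating a bucket over
        // the S3 API creates a namespace, so this exercises the same path.
        s3.createBucket("nasuni-smoke-test");
        System.out.println("Namespace exists: " + s3.doesBucketExistV2("nasuni-smoke-test"));
    }
}
```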
Contents: Introduction | Logstash Configuration (Input Section, Filter Section, Output Section) | Elasticsearch Indexes and Managing Indexes w/ Kibana | Visualizing Elasticsearch Data w/ Kibana | Monitoring HCP with ELK - Step by Step (Step 1: Configure HCP for Monitoring; Step 2: Logstash Configuration; Step 3: Confirm Index Creation; Step 4: Create Your Index Pattern; Step 5: Import the Visualizations and the Dashboard; Step 6: View the Dashboard; Step 7: View and Edit Visualizations; Step 8: Get to Know Kibana) | Tips and Tricks (Elasticsearch, Kibana, Logstash) | Troubleshooting ...
Contents: Introduction | Installing ELK (Step 1: Disable Firewall or Open Ports; Step 2: Install Java; Step 3: Install Elasticsearch; Step 4: Install Kibana; Step 5: Install Logstash) | Conclusion. Introduction: This guide is the first in a series explaining how to use the open-source ELK stack to visualize the performance of a system. This post includes instructions to install the ELK software. The second guide in the series, Performance Monitoring w/ ELK - Part II: Monitoring HCP Access Logs, gives instructions to configure HCP and your newly installed ELK software to visually monitor HCP. Following the instructions in these 2 posts, ...
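Once the packages are installed, a quick way to confirm Elasticsearch is answering is to hit its cluster health endpoint. Here is a minimal sketch using Java 11's built-in HTTP client, assuming a default single-node install listening on localhost:9200:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ElasticsearchHealthCheck {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Default Elasticsearch REST port; adjust if your installation differs.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/_cluster/health"))
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // HTTP 200 with a JSON body containing "status" means the node is up.
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```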
The AWS Java SDK does not natively support Active Directory authentication, but it is flexible enough that with a little bit of coding you can use your AD credentials with HCP over the HS3 gateway. Follow this link for a working code example that uses Active Directory credentials to interface with HCP using the AWS Java SDK. This is not intended to be a general S3 programming example; it is strictly intended to demonstrate how to use AD with HCP and the AWS Java SDK, for an audience that is already familiar with AWS Java SDK programming. For this to work you will need to be on HCP version 8.0 or higher. You cannot create a ...
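As a rough illustration of the approach (this is not the linked example itself), HCP derives S3-style credentials from a user name and password: the access key is the Base64-encoded user name and the secret key is the hex-encoded MD5 of the password. The sketch below applies that mapping with the AWS SDK for Java v1 and Apache Commons Codec; whether AD accounts need a domain-qualified form like user@DOMAIN is an assumption to verify against the linked example and your HCP configuration.

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import org.apache.commons.codec.binary.Base64;
import org.apache.commons.codec.digest.DigestUtils;

public class HcpAdCredentialsSketch {
    // HCP's credential mapping for the HS3 gateway: Base64 of the user name as
    // the access key, MD5 hex of the password as the secret key. For AD users a
    // domain-qualified name (user@DOMAIN) may be required -- an assumption to
    // verify against your HCP configuration.
    static BasicAWSCredentials fromUserPassword(String user, String password) {
        String accessKey = Base64.encodeBase64String(user.getBytes());
        String secretKey = DigestUtils.md5Hex(password);
        return new BasicAWSCredentials(accessKey, secretKey);
    }

    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                        "https://ns1.tenant1.hcp.example.com", "us-east-1")) // hypothetical
                .withPathStyleAccessEnabled(true)
                .withCredentials(new AWSStaticCredentialsProvider(
                        fromUserPassword("aduser@EXAMPLE.COM", "password")))
                .build();

        s3.listBuckets().forEach(b -> System.out.println(b.getName()));
    }
}
```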
Hi, this is just a quick post to share a particularly helpful method for troubleshooting issues between a Java client application and the HCP S3 Gateway. Most Java-based software will allow you to inject Java system properties at launch time, either by editing a configuration file or a launch script. This post does not cover how to achieve that step; to answer that question, use the product documentation, Google, or ask the vendor's support team. If you will be adding Java system properties by configuration, you want to add the following name-value pair (choose the correct value for your system type): Name: log4j.configuration, Value: file:///home/user/log4j.properties ...
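If editing a launch script is awkward, the same property can also be set programmatically, as long as it happens before log4j initializes. A minimal sketch; the properties-file path is the placeholder from the post:

```java
public class LaunchWithWireLogging {
    public static void main(String[] args) {
        // Equivalent to passing -Dlog4j.configuration=file:///home/user/log4j.properties
        // on the java command line; must run before log4j is first initialized.
        System.setProperty("log4j.configuration", "file:///home/user/log4j.properties");

        // Prints log4j's own initialization steps, which helps confirm that the
        // properties file was actually found and loaded.
        System.setProperty("log4j.debug", "true");

        // ... continue with normal application startup here ...
    }
}
```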
HCP chargeback reports contain valuable information that is useful for understanding HCP utilization and workloads. The problem is that the data can be overwhelming: trying to understand it in its tabular form is all but impossible. What we need is a visual representation, but building charts and graphs is time consuming, isn't it? Actually, no: you can visualize chargeback report data in under 5 minutes using the PivotChart features in Excel. Read on to find out how. In the HCP System Management Console, go to the Monitoring => Chargeback page. Select the range of dates you would like to report and choose Hour or Day reporting ...
Object storage has transitioned from an archival and Tier 2 backup solution to a digital-age powerhouse, thanks to its massive storage capacity, processing power, and security capabilities. The digital age has generated a deluge of data, much of it unstructured. Faced with high on-premises and cloud storage costs, the inability to perform data analytics, and the increased risk of non-compliance with regulations protecting consumer data, forward-looking organizations are turning to object storage to manage, analyze, protect, and even build critical applications supporting this data. Storing data in its native format means it can be managed ...
Use Case: The customer wants to track who accessed a specific tenant, when, and what action they performed on which object. This customer has no HCM instance deployed. If HCM were deployed, the procedure would be even simpler, as HCM already collects all syslog information from HCP; however, HCM does not make use of the data we need, because it only visualizes performance metrics and drops the audit data. If no HCM is installed, configure syslog on all HCPs to be audited to send syslog output to the HCI server on port 6901, the same as you would do for HCM (Logstash is listening on port 9601). If HCM is deployed, you don't ...
On a daily basis we engage with organizations searching for ways to get more value from their data. We're consistently presented with the same 5 object storage requirements:
· Variety: the breadth of data types continues to grow, making an adaptive storage solution a necessity.
· Value: massively scalable solutions must support hybrid architectures to drive the greatest cost-of-ownership advantages by balancing capex and opex.
· Velocity: data must be accessible quickly to support modern, Tier 1, low-latency workloads.
· Veracity: long-term data integrity and durability are non-negotiable. ...
Paul Lewis posted a blog the other day about becoming an agent of change that opens with the following: "Change has been thrust upon us all over the last few months, as COVID-19 and its impact continue to dominate global headlines. As employees are urged to #stayhome, businesses are having to rapidly abandon old ways of working and embrace radically decentralized new operating models." That got me thinking about how #workfromhome and data collaboration are no longer a luxury, but a necessity. Looking at articles and research data like those below, it's apparent that work from home is going to be more prevalent going forward. ...
HCP for Cloud Scale v1.5 new features: S3 Select: the S3 Select Object Content method allows an S3 client such as Apache Spark, Apache Hive, or Presto to retrieve a portion of a structured object. S3 event notification: HCP for Cloud Scale can send notifications of specified events in a bucket to a message server for applications to consume; compared to periodically scanning objects in a bucket, this is an efficient way to signal changes. S3 object locking: HCP for Cloud Scale supports object locking, which prevents specified objects from being either modified or deleted. #Blog ...
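To illustrate the shape of an S3 Select call, here is a minimal sketch using the AWS SDK for Java v1 against a CSV object. The endpoint, bucket, key, and credentials are hypothetical placeholders, and the serialization options HCP for Cloud Scale supports should be checked against its documentation.

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.CSVInput;
import com.amazonaws.services.s3.model.CSVOutput;
import com.amazonaws.services.s3.model.ExpressionType;
import com.amazonaws.services.s3.model.FileHeaderInfo;
import com.amazonaws.services.s3.model.InputSerialization;
import com.amazonaws.services.s3.model.OutputSerialization;
import com.amazonaws.services.s3.model.SelectObjectContentEventStream;
import com.amazonaws.services.s3.model.SelectObjectContentRequest;
import com.amazonaws.services.s3.model.SelectObjectContentResult;

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class S3SelectSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical HCP for Cloud Scale endpoint and credentials.
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                        "https://hcpcs.example.com", "us-east-1"))
                .withPathStyleAccessEnabled(true)
                .withCredentials(new AWSStaticCredentialsProvider(
                        new BasicAWSCredentials("ACCESS_KEY", "SECRET_KEY")))
                .build();

        // Ask the server to filter the object and return only the matching rows.
        SelectObjectContentRequest request = new SelectObjectContentRequest()
                .withBucketName("analytics")
                .withKey("events.csv")
                .withExpressionType(ExpressionType.SQL)
                .withExpression("SELECT s.* FROM S3Object s WHERE s.status = 'ERROR'");
        request.setInputSerialization(new InputSerialization()
                .withCsv(new CSVInput().withFileHeaderInfo(FileHeaderInfo.USE)));
        request.setOutputSerialization(new OutputSerialization().withCsv(new CSVOutput()));

        SelectObjectContentResult result = s3.selectObjectContent(request);
        try (SelectObjectContentEventStream payload = result.getPayload();
             BufferedReader reader = new BufferedReader(
                     new InputStreamReader(payload.getRecordsInputStream()))) {
            reader.lines().forEach(System.out::println);
        }
    }
}
```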
Pursuing a digital transformation strategy requires a proven storage solution offering flexibility, massive scale, high performance, metadata-based intelligence and powerful data management capabilities. Hitachi Content Platform delivers these capabilities and more. Indeed, for the fourth consecutive time, Hitachi Vantara has been named a "Leader" in IDC's MarketScape for Object-Based Storage. Hitachi Vantara's rating was based on IDC's assessment of our Hitachi Content Platform portfolio. IDC analysts evaluated 13 products from the world's best-known object storage vendors, measuring them based on their ability to deliver ...
Don't compromise your cloud: get the technology, storage, infrastructure and services to innovate. The Center of Excellence offers support which includes demos, PoCs, and sales/pre-sales support functions. For more information CLICK HERE. To request support CLICK HERE. In addition to the products/solutions below, the CoE can also assist with 3rd-party products and services: Veritas NetBackup; Commvault Simpana; Moonwalk; POiNT; Veritas Enterprise Vault; STAR Storage SEAL; Metalogix; bespoke services (from a 3rd party); HCP-AW SharePoint Connector (BEP Systems); ISV integration & app development assistance. See the demo overview below. ...

HCP for Cloud Scale Lab

This lab is hosted in Hitachi Automated Labs Online (HALO). It includes two servers containing a minimal CentOS 7 installation. You will validate the environment, execute a series of pre-configuration steps, install HCP for Cloud Scale (HCPCS) software on master and worker nodes, and perform numerous software administration tasks. Upon completion, you will be able to discuss and plan server configuration topics relevant to hosting and installing HCPCS software. Download the Lab Guide to get started. Recommended learning sessions - Learning Goals: I'd like to install a Cloud-Scale master node; I'd like to ...
Announcing Hitachi Content Intelligence v1.5. Maximize the value of your enterprise data, wherever it resides, to deliver the best-quality information where and when it's needed most. From deeper integrations with Hitachi Content Platform, to customized searches, to improved data processing and scalability, the latest release of Content Intelligence comes with many great new features and enhancements to an already-robust data processing solution. I will talk about most of these improvements in this blog. Tighter Integration with Hitachi Content Platform: Content Intelligence can connect to ...
Nasuni integrates with HCP via the S3 API; here are quick steps to get Nasuni set up to integrate with HCP. For instructions to set up HCP for Nasuni, please refer to the document How do I configure Hitachi Content Platform for Nasuni? Configuring Hitachi Content Platform credentials on a Nasuni Edge Appliance. Note: If using the Nasuni Management Console to manage the Nasuni Edge Appliance, see Configuring Hitachi Content Platform credentials on the Nasuni Management Console. To configure HCP credentials: On the Nasuni Edge Appliance, click Configuration, then select Cloud Credentials from the menu. The User Provided Cloud Credentials page ...
Hitachi Content Intelligence delivers a flexible and robust solution framework for comprehensive discovery and quick exploration of critical business data and storage operations. Whether your data is on-premises, off-premises, in the cloud, structured, or unstructured, Hitachi Content Intelligence (Content Intelligence) delivers a powerful framework of tools for connecting to, transforming, and acting upon organizational data to maximize its value for better business outcomes. Using the Content Intelligence Workflow Designer, you can create customized workflows to connect to all of your data repositories, transform that data with a comprehensive ...