
Hu's Place


How Cool Is That!

Posted by Hu Yoshida Employee Jun 12, 2018

CRN, Computer Reseller News, a leading trade magazine, has named Hitachi Vantara one of the 30 Coolest Business Analytics Vendors. This may surprise many who see Hitachi Vantara, part of a 118-year-old company with traditional values like Harmony (Wa), Sincerity (Makoto), and Pioneering Spirit (Kaitakusha-Seishin), in the middle of a list of technology startups. Hitachi and Hitachi Vantara consider business analytics one of the key drivers of our customers’ success in this age of big data, digital transformation and IoT, and we are approaching business analytics with the same “startup” or pioneering spirit that has sustained us for over 118 years.

 

Cool Analytics.png

Hitachi Vantara’s appearance in this list of 30 “cool” companies may also be surprising from a “coolness” standpoint. Most of these companies are hip new startups. The next oldest company is Microsoft, which, like us, has had to reinvent itself many times to remain relevant.

CRN.png

Actually, Hitachi Vantara is the new kid on the block, since it was formed in September 2017 through the merger of Hitachi Data Systems (IT infrastructure systems), Hitachi Pentaho (data integration and analytics), and Hitachi Insights (IoT). CRN recognizes that Hitachi Vantara is able to provide “cloud, Internet of Things, big data, and business analytics products under one roof.” CRN cites Pentaho as a core Hitachi Vantara product for data integration, business analytics and data visualization. CRN also mentioned Pentaho’s new machine learning orchestration tools, available as a plug-in through the Pentaho Marketplace, which help data scientists better monitor, test, retrain, and redeploy predictive models in production.

 

We have registered over 1,500 Pentaho enterprise licenses. However, since Pentaho is open source, with a thriving community behind it, there are hundreds of thousands of open source users, and we are adding about 5,000 to 10,000 new users per week. While Pentaho positions us for a place on this list, there is much more that Hitachi Vantara can provide for big data and business analytics.

 

CRN’s report positions business analytics tools at the top of the big data tools pyramid, where they derive insight and value from the ever-growing volume of data. Hitachi Vantara focuses on the entire pyramid, since the insights and value are only as good as the data that goes into them.

Pyramid.png

While Pentaho is a core product in our analytics portfolio, we have other analytic tools like:

  • Hitachi Content Intelligence, part of our Hitachi Content portfolio, automates the extraction, classification, enrichment, and categorization of data residing in Hitachi Vantara and third-party repositories, on premises or in the cloud.
  • Hitachi Data Streaming Platform provides proactive streaming analytics to transform streaming IoT data into valuable business outcomes.
  • Hitachi Video Analytics can drive new business success through insights into customer behavior and preferences.
  • Hitachi Infrastructure Analytics Advisor uses machine learning to prescribe optimal IT infrastructure performance SLAs, improve user satisfaction, simplify budget forecasting with predictive analysis, and accelerate fault resolution by using AI to diagnose root causes, prescribe resolutions and enable admins to automate fixes.

 

Hitachi Vantara also has the good fortune to be part of the larger global Hitachi corporation, which has operational expertise in many industries, from healthcare to energy to transportation systems. This expertise is critical in developing industry- or business-specific analytic models and automation tools that drive business outcomes.

 

CRN put together this list of 30 business analytics companies for the following purpose:

 

“…we've put together a list of 30 business analytics software companies that solution providers should be aware of, offering everything from simple-to-use reporting and visualization tools to highly sophisticated software for tackling the most complex data analysis problems.”

 

Hitachi Vantara is proud to be recognized as one of the 30 Coolest Business Analytics Vendors in CRN’s Big Data 100. We congratulate the other members of this list. Since big data and analytics require an ecosystem of vendors, I am sure that we will be working with many of these companies, as we already do with vendors like Microsoft and Salesforce. We will work with many more vendors and customers as we continue to develop the pyramid of big data tools required to address our customers’ business requirements.

 

"Cool" wasn't in anybody's vocabulary 118 years ago, but the essence was captured in Harmony, Sincerity, and Pioneering spirit.

In order for companies to be more agile in responding to changing customer needs and market dynamics, they must have a storage infrastructure that makes data available at the right time, at the right place and in the right format, so that they can derive value from it and turn raw data into insights that drive business outcomes. With the explosion of data and the increasing demands on that data, data centers must focus more on the data, and the information that can be derived from it, than on the storage infrastructure that supports it. In the past, when data centers were considered cost centers, their efficiency was measured by how many terabytes of capacity could be managed by one full-time employee. Now data centers are measured on how fast they can drive innovation in delivering data, information, and applications.

Cogniant.jpg

However, the storage infrastructure is still very important. It must support many more development platforms and applications, scale quickly to meet demand, comply with increasing availability and governance requirements, span the breadth of requirements from edge to core to cloud, and seamlessly incorporate new technologies in order to stay on the leading edge. So the question is: how does a data center maintain a leading-edge storage infrastructure when its focus needs to be on business outcomes?

 

Storage infrastructure workloads can be reduced through a shared services or managed services approach. A storage service with automated tools for configuration management, analytics for optimization, and central management of copies, clones, replicas, and backups can greatly reduce operations and personnel costs and improve efficiency. This will help data centers free up more resources to focus on development and applications as they transform into an information technology driver for business transformation.

 

Buying storage as a service breaks the traditional buying cycle, in which you buy enough storage capacity for the next five years at today’s price and technology, even though both are likely to undergo major changes over those five years. If your capacity is 100 TB today and growing at a 20% CAGR, you have just bought 250 TB for the next five years when you only needed 120 TB for the next year. What happens in year three if business requirements have changed and you need more or less capacity, or the performance of your storage puts you at a disadvantage to competitors with newer technology? Storage as a service provides the agility to buy what you need when you need it and frees you up to respond to future business requirements such as big data initiatives and smart products and services. Storage as a service helps enterprises meet business goals with improved commercial viability and optimized, efficient data management.
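The over-buying arithmetic above is simple compound growth, and it can be sketched in a few lines of Python (the function name is illustrative, not from any product):

```python
def projected_capacity(base_tb: float, cagr: float, years: int) -> float:
    """Capacity needed after `years` of compound annual growth (CAGR)."""
    return base_tb * (1 + cagr) ** years

# 100 TB today, growing at a 20% CAGR:
need_next_year = projected_capacity(100, 0.20, 1)   # 120 TB
buy_for_5_years = projected_capacity(100, 0.20, 5)  # ~249 TB, i.e. the "250 TB" buy

print(f"Year 1 need: {need_next_year:.0f} TB")
print(f"5-year buy:  {buy_for_5_years:.0f} TB")
```

The gap between the 120 TB you need next year and the roughly 250 TB you pre-purchase is exactly the capacity that sits idle, at today's prices, waiting for growth that may never match the forecast.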

 

Hitachi Vantara has partnered with Cognizant Technology Solutions, a world-leading professional services company, to bring industry-leading storage-as-a-service (STaaS) solutions to small, medium and enterprise companies and facilitate the transition of their data center models for the digital era. The Hitachi Virtual Storage Platform (VSP) scales from small to enterprise with enterprise-ready software-defined storage, advanced global storage virtualization, and efficient workload consolidation. In addition to the VSP, Hitachi Vantara provides the AI-driven Hitachi Infrastructure Analytics Advisor to analyze and recommend improvements in operations, simplify budget forecasting and accelerate fault resolution; Hitachi Automation Director to improve performance, avoid bottlenecks, and optimize end-to-end performance and resiliency; and Hitachi Data Instance Director to optimize data protection and copy data management.

 

The combination of Cognizant’s proven professional services leadership, bespoke solution design, vertical business experience, and track record of deploying and managing large and complex storage infrastructure for customers, coupled with Hitachi Vantara’s industry-leading storage hardware and software, will provide a new level of storage solutions that enable sustainable IT transformation from infrastructure to information.

 

STaaS is an integrated software-defined storage service that reduces cost, minimizes complexity, manages growth efficiently and helps customers transition their storage environment securely from a ‘Plugin-Migrate-Run-Retire’ model to a consumption-based ‘as-a-service’ model.

 

sappurpose-player-wide.jpg

This week I had the pleasure of being a panelist on SAP Radio. The moderator was Bonnie D. Graham, Global Thought Leadership Media Director and the creator, producer and host of SAP Game-Changers Radio. Other panelists included Karin Underwood, a first-year MBA student at the Stanford Graduate School of Business, where she is a co-president of the Social Innovation Club and winner of the Impact Design Immersion Fellowship; she represented the views of the next generation of business leaders. Katie Morgan Booth, who leads Corporate Social Responsibility (CSR) for SAP North America, also joined us to give her perspective from a large, 10,000-employee technology and information company.

 

The question we were to address was “Can doing good using social innovation be good for your company’s bottom line?”

Serving the world.png

I pointed out the growing social challenges due to the increase in world population, mega cities, climate change, and dwindling resources, which provides an opportunity for corporations to develop new markets, enable new consumers, and create an environment that will enable social innovation and business growth.

 

Karin, the MBA student from Stanford, said that the notion that young people entering the workforce should earn a lot of money first and give back later is broken for many of her classmates. They want to find a way to do big, impactful things in the world early in their careers, and businesses have a huge opportunity to show their commitment. The nature of work has changed, and employees are voting with their feet. When businesses focus only on maximizing profits for their shareholders, they are making short-term choices that can hurt their ability to attract top-quality talent and to create economic value and returns for society in the long term.

 

Katie, the CSR director from SAP, was concerned about the pace and scale at which innovation is accelerating and about its impact on people, organizations, and communities. We need to prevent people from being left behind. Inclusive education and workforce readiness programs are crucial to economic, social and environmental sustainability, as well as to future innovation. We have the responsibility to meet people where they are and provide them with the hard and soft skills needed to secure employment in a digital workplace. We need responsive solutions and coordination from all parts of society – governments, citizens and private industry alike – to re-envision an educational system based on lifelong learning that can fully prepare workers for the jobs of the future. In CSR we have the unique opportunity to get our employees out into their communities, take them outside their comfort zones, and show them how others live and the challenges that others face. When they see this, it often gives them a push to go further and deeper to learn about social issues and try to improve them. Many people take that spirit and challenge the status quo within our company, within their own jobs and in the products they work on. The more someone is challenged, the more they build empathy for others’ experiences, and that is what is necessary to move the world forward.

 

I can relate to Katie’s comments on CSR. Although I work on Hitachi’s Social Innovation strategy and the different technologies that can drive innovation and create sustainable social change as well as business outcomes, it is important to keep myself grounded in what this means for individuals as well as for society in general. This last weekend, I participated in a Relay for Life in my hometown of Morgan Hill, just 30 miles south of where I work in Silicon Valley. Relay for Life is a volunteer fundraising event for the American Cancer Society. Teams are formed to walk in a 24-hour relay around the local community park. I joined my daughter Elizabeth's team. Each team has a tent where they hold fundraising events like raffles, sell crafts that the teams have made, and distribute educational material about the different forms and treatments of cancer. More importantly, it is a time to share and support each other. The teams are formed by neighbors, friends and families who come together to honor the memory of a cancer victim, support a cancer patient or celebrate a survivor. We had the opportunity to hear firsthand from cancer survivors and caregivers, like the young mother who was first diagnosed with cancer nine years ago, went into remission, but recently learned that the cancer had returned. Hearing these individual stories from the people in our community adds urgency to everything we do.

FullSizeRender.jpg

The conclusion of our panel was that Social Innovation will be good for a company’s bottom line, and that if companies are to attract the new generation of innovative business leaders, they must focus their business strategies on more than maximizing profit. Social Innovation should also be a personal goal for each of us as we build a healthier, safer, more sustainable world for everyone.

 

The future is full of social challenges that will drive a need for Social Innovation. Addressing this need will require collaboration across government, business, and non-profits. In order for these innovations to be sustainable, corporations will need to integrate Social Innovation into their business strategy so that profits are tied to social innovation.

 

Many of the social challenges will come from an exploding world population. According to Wikipedia, the world population grew from 2 billion in 1927 to 7.6 billion in 2018 and is expected to reach 9.8 billion by 2050. The most rapid growth will be in countries with lower standards of living, while people in countries with higher standards of living will live longer, putting a growing strain on health care and retirement systems. However, if population growth should decline, there may be even greater problems, as fewer young workers struggle to support a burgeoning elderly population. Soylent Green is not the type of solution that we would like to contemplate.

soylent-green-year-1973-usa-director-richard-fleischer-movie-poster-EJ260B.jpg

More people will consume more resources, like water, energy, infrastructure, goods and services. Some of these resources are already very limited like clean water, carbon fuels, and the rare earth metals required for new technologies. More people will also create more pollution and waste which will lead to health issues, climate change, and food shortages. With climate change we will see an increasing pattern of floods, droughts, global warming and rising sea levels that threaten to inundate our coastal urban areas.

 

By 2050, 70% of the world’s population will live in cities, where problems like unemployment, slums, crime, homelessness, traffic congestion, sanitation, urban sprawl, and overwhelmed social services will increase. More and more people will migrate from rural areas seeking a better life but lacking the skills for urban employment, and refugees fleeing poverty and oppression in other countries will create challenges for integration into mainstream society. Social problems will be especially challenging for mega-cities with populations over 10 million. For example, many of these mega-cities are hundreds of years old, built on antiquated underground water and sewage systems that are difficult to update without disrupting the infrastructure above them. Istanbul, a mega-city that is over 1,000 years old, loses an estimated 34% of its potable water to leakage.

 

There can be tremendous opportunities for companies to achieve great profits in such densely populated areas. Many analysts predict that new markets like IoT could reach $1 trillion by 2025. This could create tremendous wealth for some and widen the gap between the haves and have-nots. Gentrification, the influx of more affluent residents into urban neighborhoods, can drive up the cost of housing and retail space, displacing poorer people and small businesses and creating an even wider gap. This can lead to alienation and discrimination, which are breeding grounds for terrorism, where individuals or groups take violent action against the public to advance their political, religious, or ideological goals. Public safety will be a major concern for urban areas, particularly those with high visibility.

 

There is a clear and pressing need to address social problems if we are to have a healthier, safer, and more sustainable lifestyle for ourselves and our children’s children. Although many companies have a CSR (corporate social responsibility) program through which they donate funds to social causes, sponsor charity events, and encourage employee participation in outside social initiatives, this does not go far enough. Wanting to do good is not enough. Sustainable social change can only happen when corporations integrate Social Innovation into their corporate strategy for delivering business outcomes. By helping to solve social challenges, a Social Innovation strategy helps corporations build a sustainable business model and long-term viability. They can do this by building new markets, strengthening supply chains with access to sustainable resources, investing in talent diversification, enabling new consumers, and helping to create a social environment that is conducive to business growth. Social Innovation includes the process of transforming an idea or invention into a solution that creates value for society and stakeholders.

 

Corporations that are focused only on short-term profit may be first to market with new technologies, but they may also create even greater social problems. The late Stephen Hawking warned that AI could be the worst event in civilization, and he urged creators to employ best practices and effective management when creating AI. Hitachi integrates the principles of Kaizen into its approach to AI so that AI is used to empower workers rather than displace them.

 

Hitachi has always been conscious of the environment and the need for social innovation, beginning in 1910 when Hitachi was established to build electric motors that improved the efficiency of mining operations. In 2009, during the global economic crisis, Hitachi announced our strategy to strengthen our Social Innovation business, stating that we would be “concentrating to build a more stable earnings structure with a focus on the Social Innovation business, which comprises social infrastructure supported by highly reliable and highly efficient information and telecommunications technology.” When you map that 2009 statement forward to today, you can see how it fits our IoT strategy, which comprises OT (social infrastructure) and IT (information and telecommunications) technologies. Through IoT, Hitachi is able to address many of the social problems described above while improving our bottom line.

 

Since then, Hitachi has delivered innovative solutions from clean water to smart cities that have addressed social problems and improved our bottom line so that we can continue to grow the company and invest in new Social Innovation projects. As our Hitachi Vantara CEO Brian Householder has described it, we are working to a double bottom line, delivering solutions and outcomes to benefit business and society.

Hitachi Future.png

While the acronym IT stands for Information Technology and is synonymous with the data center, in reality the focus of IT has often been more on infrastructure, since infrastructure represented the bulk of a data center’s capital and operational costs. Digital transformation and the need for more agile business outcomes require the transformation of IT from an infrastructure focus to an information and application focus.

 

Triangles.png

Digital transformation is not just about being more efficient at what we normally do. Digital transformation means turning everything upside down and changing our focus. For the data center, digital transformation means focusing on business outcomes through information and applications that support your customer. Customers don’t care what vendor’s infrastructure you have or what you have to do to manage it, as long as they get the service they expect.

 

The figure above is not meant to suggest that infrastructure is not important. On the contrary, infrastructure becomes more important, since it needs to support a host of new and changing development platforms and applications and must be more agile and flexible, leveraging the latest technologies to meet changing business requirements. Data is the fuel that drives digital transformation, and since data is persistent (it will outlive the applications that created it and the technologies that store it), the choice of storage infrastructure becomes even more important. In the figure above, I would place storage infrastructure at the tip of the spear when it comes to digital transformation. However, the question is: how do you change your focus away from storage infrastructure when it is so important? The answer lies in a smart approach to data center modernization.

 

Data center modernization requires an agile data infrastructure, modern data protection, and intelligent operations.

3 Pillars.png

An Agile Data Infrastructure leverages the latest advances in technologies like flash, compression and deduplication to scale performance to millions of IOPS and GB/s of bandwidth, and to scale capacity to petabytes with many millions of volumes and snapshots. An agile data infrastructure includes software that scales from small rack-scale storage to multi-frame enterprise systems and consolidates block, file, and object storage, with support for virtualization, cloud and persistent storage for containers. Virtualization is a key differentiator for consolidating heterogeneous vendor storage systems and enabling non-disruptive migration to future storage systems.

 

Modern Data Protection provides 100% data protection, backed by a written guarantee. It provides dual active storage systems for zero-recovery-time and zero-recovery-point availability, along with synchronous and asynchronous replication. Modern data protection eliminates the need for traditional backup and simplifies the management of copies. It also includes security and privacy features that are designed in to protect the data whether it resides on edge devices, on mobile devices, in the core or in the cloud. While Hitachi will do everything to protect your data, we want you to own your data – not us. Where encryption is required, we give you control of the encryption keys.

 

Intelligent Operations will enable you to harness the power of data to improve and automate operational efficiency, anticipate customer demand, and generate new revenue streams. As the demands on the data center increase, operational efficiency can deteriorate and exposure to downtime and data loss increases. An AI-powered brain is needed to provide deeper data center insights by looking across the entire data path – virtual machines, servers, networks and storage – and using machine learning to optimize, troubleshoot, and predict data center needs. AI can be integrated with an automation engine that orchestrates the delivery and management of IT resources, freeing up data center operations staff to work on the information and applications that support business requirements. The automation tool should also integrate with other infrastructure services: IT service management tools like ServiceNow® for tracking and control of IT resource delivery, REST APIs for provisioning third-party resources including storage, and data protection tools like Hitachi Data Instance Director to protect against data loss and downtime.

 

 

New Enhancements to Hitachi Vantara Data Modernization Offerings

Hitachi Vantara recently announced enhancements to its Agile Data Infrastructure and Intelligent Operations portfolio.

 

The Agile Data Infrastructure portfolio includes new enterprise-class Hitachi VSP models: the all-flash VSP F700 and VSP F900 and the hybrid flash VSP G700 and G900 systems. To reach a broader range of customers, Hitachi is introducing new midrange models: the VSP F350, F370, G350 and G370 systems. The systems are powered by the next generation of the Hitachi Storage Virtualization Operating System, SVOS RF, which has been re-architected to deliver the following improvements:

  • Up to 3x IOPS improvement
  • 25% lower latency
  • 3.4x faster performance with data reduction enabled
  • Up to 2.5x greater capacity scalability, 8x more volumes and 1 million snapshots
  • Modern workload support, including plug-ins for Docker and Kubernetes containers
  • Backed by a 4:1 data efficiency guarantee

For Intelligent Operations, Hitachi Vantara has integrated and enhanced its AI operations software portfolio to ensure the highest return on data center investments and to accelerate strategic outcomes. The new integration of Hitachi Infrastructure Analytics Advisor (HIAA), Hitachi Automation Director (HAD) and Hitachi Data Instance Director (HDID) simplifies data center management and sets the foundation for autonomous operations across the data center.

  • HIAA improvements:
    • Predictive analytics for better forecasting of future resource needs
    • AI-driven heuristic engine that recommends fixes and repairs up to 4x faster
    • HIAA integration with HAD to automate implementation of HIAA recommendations
  • HAD improvements:
    • Automated configuration of Hitachi Data Instance Director (HDID)
    • Automated configuration of VSP QoS
    • Integration with IT Service Management (ITSM) tools, including ServiceNow, for improved tracking

 

To see how this new generation of all-flash and hybrid flash Hitachi Virtual Storage Platforms, with its next-generation Hitachi Storage Virtualization Operating System and the integration of Hitachi Infrastructure Analytics Advisor with Hitachi Automation Director and Hitachi Data Instance Director, will enhance our modern data center portfolio and accelerate the transformation of the data center, see the following announcement letter and video:

 

https://www.hitachivantara.com/en-us/pdf/datasheet/vsp-g-series-hybrid-flash-midrange-cloud-solutions-datasheet.pdf

Recently I had the pleasure of meeting with a group from KT, South Korea’s largest telephone company. It was very exciting to hear about their experiences at the Pyeongchang Winter Olympics, where they partnered with major companies like Intel and Samsung, using KT’s 5G wireless network technology to deliver the most high-tech Olympic Games in history! With the world’s telecom giants racing to unveil the world’s first 5G network, KT was the first to provide a large-scale pilot service and showcase it at one of the world’s most public venues, the Winter Olympics.

 

Inntel drones.png

 

The Pyeongchang games opened with a dazzling display of a record-setting 1,218 Intel drones with onboard LEDs, joined and orchestrated through 5G connectivity to a central computer. Intel provided live or on-demand VR coverage of 30 events, a project powered by the 5G network. Between three and six camera ‘pods’, each containing 12 4K video cameras, were used for events such as speed skating, alpine skiing and bobsleigh. These cameras generated as much as 1 TB of data per hour. While Gigabit Wi-Fi could have provided this speed and capacity, 5G provided ubiquitous coverage across the venue and real-time control because of its low latency. In KT’s 5G pilot, a video demonstrated the speed and capability of 5G, using real-time, 360-degree video of athletes competing, displayed on a Samsung 5G display. It’s a feat that would be impossible on current 4G technology without buffering. Although peak 5G speeds could be 20 times faster than 4G, the speeds reached in these trials were four times faster than 4G, which still allowed crisp streaming of the Games’ action from all angles.

 

In addition to video streaming and VR, other 5G-enabled use cases included artificial intelligence (AI) robots that helped inform and entertain fans and athletes in Korean, Chinese, Japanese and English, and self-driving buses that served thousands of fans with safe, efficient transport between venues. The speed of the 5G network enabled the buses to receive information in real time from a central control center, helping them avoid obstacles and collisions with other vehicles. The buses were also able to download and display 3D video files on transparent screens, giving fans a front-row seat even before they arrived at their chosen event.

 

Samsung equipped two Dutch speed skaters with smart suits that had sensors positioned over the material to feed live body-position data to the skaters’ coaches. Coaches were able to analyze their racers’ posture and suggest improvements through coded signals sent to a communications device on the skater’s wrist. While these suits were used only in training and not in competition, the Dutch team won gold medals in seven of the ten individual speed skating events and four medals in short track speed skating. Over the years the Netherlands has built a dedicated culture and athletic infrastructure for producing speed skating superstars. What improvements could be possible with 5G technology in their training regimen?

 

This successful showcase of 5G technology will help to accelerate the development of standards and the delivery of commercial 5G networks in 2020 or sooner. While 4G wireless mobile technology revolutionized the consumer market and boosted the use of cloud, 5G is set to transform the edge for IoT and Industry 4.0. 5G is the fifth generation of wireless communications technology and will enable new kinds of users at the edge that require very low latency, low power consumption, low cost, high reliability, and exponentially higher data loads. 5G will enable the sharing of information among systems like smartphones and robots, as well as bi-directional M2M communications, providing data services that differ from the voice services offered to mobile phone users. Vertical industries like e-commerce, manufacturing, medicine, automotive, oil and gas, and logistics are developing transformational business propositions on top of 5G.

 

Congratulations to South Korea on their success in delivering an amazing, high-tech Winter Olympics, and to KT for providing the first large-scale pilot of a 5G network, which will help to advance the commercialization of this technology and unleash the full potential of IoT and Industry 4.0 applications.

 

Here is a short video of the amazing opening drone display at the Pyeongchang Winter Olympics.

According to Wikipedia, serverless computing is a cloud computing execution model in which the cloud service provider dynamically manages the allocation of machine resources. Serverless computing still requires servers; the name is used because the server management and capacity planning decisions are completely hidden from the developer or operator. Serverless code can be used in conjunction with code deployed in traditional styles, such as microservices running in containers. The drawing below illustrates what the cloud service provider supplies and what the developer or operator supplies with serverless computing, compared with Infrastructure as a Service (IaaS).

serverless.png

 

AWS Lambda is a good example of serverless computing from a cloud service provider. To use it you write code (in C#, Java, Node.js or Python), set a few simple configuration parameters, and upload everything (along with required dependencies) to Lambda. This package is what Lambda calls a function, which can be triggered automatically by other AWS services or called directly from a web or mobile app. Lambda takes care of everything else: you provide the code, and Lambda deploys the function in a container and provides everything required to run and scale it with high availability. Lambda persists the container until the function has done its job, then tears it down. Another way to describe serverless computing is Function as a Service, or FaaS. AWS Lambda was introduced in 2014, and since then other cloud providers have rushed to provide similar capabilities.
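To make the model concrete, here is a minimal sketch of what such a function looks like in Python, one of the natively supported languages. The event payload and field names are hypothetical; in production, Lambda itself would invoke `handler()` on each trigger, while locally we can call it directly.

```python
import json

def handler(event, context):
    # Lambda passes the triggering event in as a dict; we never touch the
    # server, container, or scaling configuration ourselves.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }

# Locally we can exercise the same code path the service would run:
print(handler({"name": "Pentaho"}, None))
```

The whole deployable unit is just this code plus its dependencies; everything below the function boundary belongs to the provider.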

 

The benefits, according to AWS, include: no servers to manage; continuous scaling precisely to the size of the workload, by running code in parallel in response to individual triggers; and sub-second metering, where you are charged for every 100ms that the code executes and pay nothing when your code is not running. This makes serverless computing inexpensive. It also uses containers without requiring you to deploy and manage them, so maintenance is low: there are no containers to provision, no system policies or availability levels to set, and no backend server tasks to handle. The standardized programming environment and the lack of server and container overhead mean that you can focus on writing code.
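A rough sketch of how that sub-second metering adds up. The prices below are illustrative assumptions, not a quote of AWS's actual rates; the point is that duration is billed in 100ms increments and idle time costs nothing.

```python
import math

def lambda_cost(invocations, avg_ms, gb_memory,
                price_per_gb_s=0.00001667, price_per_million_req=0.20):
    """Estimate cost under 100ms-increment metering (example prices)."""
    billed_s = math.ceil(avg_ms / 100) * 0.1   # round duration up to 100 ms
    compute = invocations * billed_s * gb_memory * price_per_gb_s
    requests = invocations / 1_000_000 * price_per_million_req
    return compute + requests

# One million 120 ms invocations at 128 MB of memory:
print(f"${lambda_cost(1_000_000, 120, 0.125):.2f}")
```

Note that a 50ms run and a 100ms run bill identically, which is why keeping functions short matters.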

 

Serverless computing has some very definite limits. You are bound by the implementation constraints of the cloud service provider. For example, Lambda has built-in restrictions on size, memory use, and the time available for a function to run. There is also a limited list of natively supported programming languages, and it is important to keep functions small, since a few high-demand functions can overload the service or lock everyone else out. Serverless computing runs in a multi-tenant environment, so there is always exposure to variations in speed and response time, and to outages caused by the demands or bad behavior of other tenants. Monitoring, debugging, and performance analysis may also be restricted by the lack of visibility into the backend services that Lambda provides. And since your software is hardwired into the provider's (Lambda's) interfaces, there is vendor lock-in; if you Google "serverless computing" and "vendor lock-in" you will see many arguments, pro and con.

 

So, what are the use cases for serverless computing? The best-suited functions are small and short-lived. Examples that fit this model include real-time analytics triggered by anomalies in a data stream; ETL that performs data validation, filtering, sorting, and other transformations before loading the transformed data into another data store; and the backend for an IoT application, where a sensor detects the need for a spare part and a function automatically places the order. Here are a few use cases that AWS Lambda proposes:
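As a hypothetical sketch of the short-lived ETL shape described above, the function below validates, filters, and transforms a small batch of records before they would be loaded into another data store. The field names are invented for illustration.

```python
def transform(records):
    # Validation/filtering: keep records that have an id and a
    # non-negative amount.
    valid = [r for r in records if "id" in r and r.get("amount", 0) >= 0]
    # Transformation: derive an integer cents field for the target store.
    for r in valid:
        r["amount_cents"] = int(round(r["amount"] * 100))
    # Sorting, so the load into the next data store is deterministic.
    return sorted(valid, key=lambda r: r["id"])

batch = [{"id": 2, "amount": 1.5}, {"id": 1, "amount": 0.25}, {"amount": -3}]
print(transform(batch))
```

Stateless, fast, and triggered per batch: exactly the profile that the function-as-a-service model rewards.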

Lambda Use cases.png

In some ways serverless computing is the next abstraction beyond containers. A container gives the user more control and portability, but that also comes with more administration. The main benefit of a container is that it packages an entire runtime environment: the application plus all its dependencies, libraries, binaries, and configuration files, bundled into one package that runs reliably when moved from one computing environment to another. Containers let developers deploy, replicate, move, and back up a workload even more quickly and easily than virtual machines do. A container-based application can be as large and complex as you need it to be. It would be easier to redesign a monolithic application into container-based microservices than to redesign it with serverless computing, where size and memory constraints create multiple bottlenecks.

 

Serverless computing is also compared with microservices. Microservices represent an architectural change in which a single monolithic application is broken down into a set of self-contained small services running on their own machines (or instances). They use lightweight mechanisms like REST interfaces for communication. Microservices can be reused across applications, eliminating duplicated effort when the same service is required by different applications. However, microservices carry operational overhead that serverless computing does not: each requires an underlying operating system that must be deployed and monitored for availability, plus application deployment and configuration work and ongoing support and maintenance. With serverless computing you leave all of that to the cloud provider and pay only for time used, in 100ms increments. On the other hand, the advantage of microservices in containers is full control of the environment, while with serverless computing you are limited to what the cloud service provider enables for you.
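The contrast shows up even in a toy example. Below is a minimal, self-contained microservice exposing a REST endpoint using only the Python standard library; notice that we own the process, the port, and the server lifecycle ourselves, which is precisely the operational overhead that the serverless model hands to the cloud provider.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A minimal REST-style response: JSON over HTTP GET.
        body = json.dumps({"status": "ok", "path": self.path}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging for the demo
        pass

# We provision and run the server ourselves (port 0 picks any free port).
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(
        f"http://127.0.0.1:{server.server_port}/health") as resp:
    reply = json.loads(resp.read())
print(reply)
```

In a serverless deployment, everything outside the `do_GET` body would disappear into the provider's platform.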

 

Serverless computing, microservices, and containers are not competing systems; they are complementary. Serverless computing is another computing model that should be considered to increase agility and efficiency in code development and application deployment.

This week I had the opportunity to hear Michael Sherwood, Director of Technology and Innovation for the city of Las Vegas talk about the IoT innovations that he helped implement there. For Michael, IoT is less about the technology and more about the outcome in terms of making the city safer, smarter, healthier and saving money for the city through greater efficiencies. The bottom line drives the story and like Hitachi, he works to a double bottom line. One is the business bottom line and the other is the bottom line for society. This fits perfectly with Hitachi’s vision for social innovation.

 

Some of the projects that he has been able to implement are a self-driving shuttle, smart intersections, smart trash collection, and a smart edge network.

 

HOpOn.png

 

The self-driving shuttle project is called "Hop On". The autonomous vehicle, which launched last November, is limited to eight passengers as it travels a 3/5-mile loop in downtown Las Vegas, and locals and tourists may ride for free. If you have been to Las Vegas, you've experienced the long, exhausting walks between hotels that don't quite justify the cost of a taxi or Uber: the city blocks are long, and the hotels are so gigantic that the perception of distance is distorted. The first glitch with Hop On occurred within hours of its launch, when a semi truck backed into the shuttle before it could go into reverse to avoid the collision. Fortunately no one was hurt, and the first lesson learned was to put a horn on the shuttle to warn other drivers.

 

Sensors and cameras at intersections can monitor the flow of traffic and optimize traffic signals to minimize wait times for vehicles and pedestrians, cutting down on carbon emissions from idling cars. Cameras can monitor public places not only for safety but also for trash collection: instead of scheduling pickups on a timetable, whether they are needed or not, trash can be collected on demand, freeing up workers for other tasks like graffiti removal.

 

Las Vegas.png

 

Las Vegas owns the majority of its streetlights, and the city is upgrading these fixtures to create an intelligent platform not only for lighting but also for a security and communications network. Each upgrade includes multicolored LED lights, a module for a security camera system, and options for fiber, WiFi, and cellular connectivity. Other modules that may be included are gunshot detection and LIDAR (Light Detection and Ranging), which uses pulsed lasers to measure distances. When multiple streetlights are connected together, they can form a mesh network that enables automated actions, like light-driven alerts managed locally rather than requiring a connection back to a cloud.

 

When asked what Hitachi platforms are involved in these projects, he mentioned Hitachi's Visualization Suite and Pentaho. Hitachi's HVP camera solutions enable edge recording and compute, a key component in addressing network efficiency and policy for video storage and access. Michael emphasized the need for analytics, which he believes will be the next big thing over the next five years.

 

Michael was also asked if it is hard to attract talent to Las Vegas to work on these projects. Michael said that he did not have a problem in attracting the right people. The talent is drawn to these projects in Las Vegas, not for the money but for the vision and a chance to be part of something that is truly innovative.

Custom logo[3][1].png

Digital transformation revolves around our ability to modernize our data center because that is where our data resides. The data center is no longer a physical location. It extends beyond the walls of the enterprise, to the cloud, and to the edge where new data is being generated and analyzed. Data holds the key to our success in this new digital era.

 

The key requirements for data center modernization are an agile data infrastructure that is cloud-aware and container-integrated; data governance that ensures data is continuously available and adheres to compliance objectives; operational intelligence to provide deeper insights; and automation to optimize and accelerate innovation.

 

Many of our customers are well on their way to modernizing their data centers. An example is Rabobank, a banking and financial services company that serves 10 million customers in 47 countries. Like all financial institutions, Rabobank is subject to a wide range of strict government regulations in each of the countries in which it operates. Previously, IT gathered communication data from multiple sources in each country, which might include email, recorded voice calls, instant messages, and chat applications. Just gathering the data was time-consuming, resource-intensive, and error-prone. Analyzing the data for compliance was even more time-consuming, since IT would have to mine multiple data sources spread across many disparate silos, some from third-party vendors and some held on backup tapes. There was no way to associate one communication type with another, and if the inquiry changed they would have to search the same systems all over again. The compliance division had to wait for IT to service each request for data before they could begin their work.

 

Rabobank needed to modernize their data center by creating a solution that would collect every piece of relevant data into a central, comprehensive data set where the data could be correctly managed and governed with compliant access control, audit trails, and automated policy-driven deletion. Rabobank engaged Hitachi Vantara to build this centralized data platform on Hitachi Content Portfolio (HCP). With HCP, Rabobank has simplified access to the data that they need for compliance investigations. They have significantly improved the efficiency and flexibility of their investigations and cut the discovery time from weeks to hours. Investigators get access to the data they need from their desk without having to ask for help from IT. As a result, the IT team was released from administrative data search and retrieval tasks to focus on more proactive tasks.

 

Hitachi Content Portfolio (HCP) is an object storage solution that enables IT organizations and cloud service providers to store, share, sync, protect, preserve, analyze and retrieve file data from a single system. HCP automates day-to-day IT operations like data protection and readily adapts to changes in scale, scope, applications, storage, server and cloud technologies over the life of the data. Tightly integrated with HCP, Hitachi Content Intelligence addresses the challenges of exploring and discovering relevant, valuable and factual information across the growing number of data producers and siloed repositories that plague organizations today. By aggregating multi-structured data, Content Intelligence enables insights to be surfaced faster, makes data management and governance more complete, and helps organizations understand the distribution of their data based on its value to the business.

 

The Hitachi Content Portfolio was recently recognized by the Business Intelligence Group with the 2018 Fortress Cyber Security Award for Regulatory Compliance. The differentiation in the nomination was the addition of Hitachi Content Intelligence to the Hitachi Content Portfolio, making it the industry's only integrated object storage portfolio offering sophisticated search and analytics capabilities. The new intelligence solution rounds out the HCP portfolio, which already offers a seamlessly integrated cloud-file gateway and enterprise file sync and sharing, and continues to improve an organization's ability to strategically manage data. More than 2,000 customers have already adopted HCP as a key component in their digital transformation journey. With Hitachi Content Intelligence, these customers can transform data into relevant business information and deliver it to the right people, when it matters most.

 

The HCP portfolio provides the agility, governance and operational efficiency to modernize your data center.

Hu Yoshida

Women of Hitachi Vantara

Posted by Hu Yoshida Employee Apr 4, 2018

The Silicon Valley Business Journal recently announced their list of the 2018 Women of Influence. We were pleased to see them recognize our CIO,  Renée McKaskle.  Renée is representative of the many women who have a significant role in driving business and social outcomes at Hitachi Vantara. For this blog post I thought that I would mention just a few of the women of Hitachi Vantara that I work with on a regular basis. There are many more across the different business units and geos within Hitachi Vantara and many of them share the same characteristics as these four women. They know how to innovate, lead, and give back to others.

 

renee.png

Renée McKaskle is senior vice president and Chief Information Officer of Hitachi Vantara and is responsible for developing and implementing information technology initiatives that align with the Hitachi Vantara mission. Since she came in as our CIO over two years ago, she has digitally transformed what was a typical IT organization, with its many silos of operations, into a dynamic, Agile team focused on business outcomes. Now, with cross-functional teams that include business as well as IT functions working together in iterative Agile sprints, IT can focus on relevant business outcomes and deliver them more efficiently. We are partnering for data centers and offer IT as a Service to our business units and to sister Hitachi companies in the Americas out of a shared facility in Denver. An example of the effectiveness of this new Agile approach to IT occurred last September, when we integrated Hitachi Data Systems, Hitachi Pentaho, and the Hitachi Insights group into Hitachi Vantara and were able to integrate and switch over our electronic systems and service desks worldwide, following the sun across the geos, in less than 30 hours! Renée is not only actively engaged in our overall corporate digital transformation; she also meets frequently with our customers to share experiences with their CIOs, and provides feedback to our marketing and development teams so that we understand the challenges they face.

 

Prior to Hitachi Data Systems, she held CIO positions at SC Johnson and Sons and at Symantec Corporation, and senior IT leadership roles at Oracle and PeopleSoft. She holds a bachelor's degree in economics from UCLA and a master's degree in information systems from the University of Texas. Renée says that she was very privileged when she was growing up, since no one ever told her that there were any limitations to her aspirations. She is a strong advocate of STEM (science, technology, engineering and mathematics) programs and careers for girls and women, to give them the confidence to do what inspires them.

 

Mary Ann.png

Mary Ann Gallo, Chief Communications Officer for Hitachi Vantara, leads the global organization responsible for driving the Hitachi Vantara brand. Mary Ann began her career as a radio and television reporter/anchor and leverages the real-time storytelling skills she developed there to engage people in her successful branding efforts. One of her recent accomplishments has been leading the transformation of the former Hitachi Data Systems brand, along with the integration of the Hitachi Pentaho and Hitachi Insights brands, into what is now Hitachi Vantara. By forming an Agile team across all the functions in these three organizations and our parent Hitachi company, her team was able to accomplish the rebranding in six months! It would take most companies over a year to go through this process, but with a nimble approach, clear direction, strategic milestones, small sprints, and a committed team, the transformation came together quickly and smoothly.

 

Mary Ann has 20-plus years in the marketing and journalism fields at technology companies, with executive roles at companies like VMware and Edelman. She has completed executive leadership programs at the Dartmouth Tuck School of Business, IMD Business School in Lausanne, Switzerland, and the WOMEN Unlimited Lead Program. She is an avid proponent of diversity and founded the Women of Hitachi program. Outside of work, Mary Ann serves on the Board of Advisors for U-Jam Fitness, an athletic dance fitness program featured at fitness centers around the world. She is also the mother of two teenage daughters.

 

Linda.png

Ke (Linda) Xu is the vice president of Emerging Solutions and Cloud Services Marketing for Hitachi Vantara. Linda and her team define the go-to-market strategy and execution for object storage, file sync and share, content intelligence, data protection, and cloud services and solutions. She also has responsibility for vertical markets like financial services. That is a lot of territory to cover, but Linda is a problem solver and is very versatile in addressing new markets and technologies. At one time, when we had a major change in our Hitachi Academy, she stepped in and managed the transition in addition to her regular job in product marketing. She is also the co-author of Cloud Storage for Dummies. All these accomplishments are amazing to me, since she was born and raised in China before coming to the U.S. to earn an MBA in Finance and Marketing at the University of Michigan Ross School of Business.

 

She began her career in China as a public relations manager for AT&T China. Today, she is widely sought after as a keynote speaker at business conferences, especially in Asia Pacific, where women look to her as a role model. Linda tells me that what motivates her is to be a role model for her daughter. She learned so much from her own mother, who accomplished her career ambitions while remaining loving, determined, and supportive as Linda grew up. That shaped Linda into who she is today, and she hopes her daughter will feel the same way when she grows up. Like many of the women of Hitachi Vantara, while work is important, family is her priority.

 

Ana.png

Technology companies usually hire experienced people who can step in and be productive from day one. Ana Sanchez joined us as an intern in 2013, and we hired her right out of the University of San Diego in 2014. She started in sales and is currently a Global Partner Marketing Manager, leading global programs and business development opportunities with Hitachi's key alliances and top technology companies such as Intel and SAP. She has been a seven-time recipient of Hitachi's Kaitakusha-Seishin (Pioneering Spirit) Award, 2013-2018, for her innovative pioneering spirit in developing and leading business development programs. In 2015 she was recognized as part of the Forbes Under 30 Network and contributed to Forbes' 7 Habits of Successful Under 30's.

 

As impressive as her contributions to the business are, she has also taken a leadership role in developing others. Ana is a co-founder of the Women of Hitachi Mentorship Program, which propels women toward careers in STEM (science, technology, engineering, math). This program now serves women representing over five Hitachi companies across every geography. She is a core team member of the Women of Hitachi board, working to extend Women of Hitachi beyond its current state. She is also on the Scholarship Board of the University of San Diego, which is focused on identifying candidates based on their commitment to making a longstanding impact at USD. Ana is representative of the younger generation of women, committed to social innovation and taking an early leadership role in driving business outcomes.

 

While Hitachi is a well-established company, over 107 years old, Hitachi Vantara is less than 1 year old, formed to help Hitachi deliver transformative digital solutions to our customers as we expand beyond cloud to IoT. Key to our success will be the culture that we develop as Hitachi Vantara. Diversity needs to be a foundation for this culture and the women of Hitachi Vantara are putting their stamp on this foundation.

Deloitte Insights and Forbes Insights are sponsoring an upcoming webinar: The Fourth Industrial Revolution is Here - Are you Ready? on Thursday, April 12 at 02:00PM EDT (11:00AM PDT). (Click here to attend and calendar this event) At this webinar you will have the opportunity to interact with Dr. Mark Cotteleer, Managing Director of Deloitte's Centre for Integrated Research and Brian Householder, Chief Executive Officer of Hitachi Vantara as they discuss the key findings from a joint Deloitte and Forbes Insight survey of more than 1,600 worldwide executives on their Industry 4.0 Readiness.

 

The First Industrial Revolution used water and steam power to mechanize production; the Second used electric power to create mass production; the Third used electronics and information technology to automate production. The Fourth Industrial Revolution builds on the Third and is the fusion of new technologies that are transforming entire systems of production, management, and governance.

 

Some questions that Dr. Cotteleer and Brian will discuss are:

  • What do executives see as their role in making the world a better place in the age of Industry 4.0?
  • How are executives using Industry 4.0 to create new value?
  • How are executives readying their workforce for these changes?
  • Do executives see Industry 4.0 technologies as a toolset to improve business as usual or enable new business models?

It will be very interesting to see the results of the Industry 4.0 survey and hear Brian's take as CEO of Hitachi Vantara, a company uniquely positioned in the integration of operational technology and information technology for Industry 4.0 and social innovation.

Nowhere are regulations more complex and difficult to comply with than in the financial services sector, where the levels of regulation and the focus on data and reporting constantly increase. New business models driven by FinTechs (technology-driven financial companies) are disrupting regulatory practice with innovative new ways of doing things. This disruption will require the acquisition of more data, the use of real-time information, the incorporation of new algorithms and analytics, and the re-evaluation of older regulations in light of technology implementations. Financial institutions are turning to RegTech companies, which use new technologies to deliver regulatory solutions that are configurable, easy to integrate, reliable, secure, and cost-effective.

 

However, even with the help of RegTechs, the current regulatory system is full of inefficiencies and ambiguities, requiring significant interpretation on the part of legal and compliance personnel who are responsible for ensuring that their organization is compliant. The figure below illustrates where the pain points occur.

MDMERR .png

While RegTechs help organizations interpret and implement regulatory solutions on the back end, it would be much more efficient if the regulations themselves could be published in a model-driven, machine-executable form. Although not suitable for all types of regulation (particularly those that govern conduct or relate to higher-level principles rather than detailed rules), machine-executable regulation could be applied to a meaningful amount of regulation and could therefore enable a significant degree of 'straight through processing' for regulated organizations.
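As a purely illustrative sketch (not the FCA/BoE model, and the rule and thresholds are invented), the idea is that a rule becomes data rather than prose, so the same checker can run it for the regulator and for the regulated firm alike:

```python
# A hypothetical reporting rule expressed as data instead of legal text.
RULE = {
    "id": "REP-001",
    "description": "Report any single transaction over the threshold",
    "field": "amount",
    "operator": ">",
    "threshold": 10_000,
}

def requires_report(transaction, rule=RULE):
    """Evaluate the machine-readable rule against one transaction."""
    value = transaction[rule["field"]]
    if rule["operator"] == ">":
        return value > rule["threshold"]
    raise ValueError(f"unsupported operator: {rule['operator']}")

print(requires_report({"amount": 25_000}))  # True
print(requires_report({"amount": 9_999}))   # False
```

Because the rule is unambiguous data, a regulator could change the threshold and distribute the updated rule without any intermediate legal interpretation.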

 

Creating machine-executable regulation starts with the regulators. The UK's financial regulators, the Bank of England (BoE) and the Financial Conduct Authority (FCA), have proposed a Model-Driven, Machine Executable Regulatory Reporting (MDMERR) approach. This could allow for complete disambiguation, as it pushes regulators to work out the precise meaning of rules in advance rather than in retrospect, during enforcement against an organization's proposed compliance. Benefits of the MDMERR approach could include:

 

Clarity: Machine-executable regulation would be less ambiguous than natural-language regulation and could be executed with little or no human supervision. The long-term benefit of reduced regulatory risk may enable increased investment in financial innovation.

 

Time Efficiencies: MDMERR could greatly reduce the time to create, update, and implement regulatory policy by allowing regulators (and regulated firms) to test proposals in near-real time. It could also allow more cost-efficient, real-time monitoring of compliance by the regulator and by firms themselves, perhaps reducing the many meetings, memoranda, and other delays in the traditional enforcement process.

 

Cost Efficiencies: From the private-sector perspective, it would decrease the time-consuming effort needed to remove uncertainty and doubt in the interpretation of regulations.

 

Change Management: The MDMERR model would let regulators distribute changes more efficiently and reduce the long lead times financial institutions need to adapt to them. Since changes would be released in machine-readable formats, financial institutions could update their systems (and controls) without additional interpretation by attorneys.

 

This all sounds very good, but is it practical given the complexity of the regulatory and compliance process? To test the feasibility of this approach, the FCA and BoE enlisted experts from the public and private sectors to create a proof-of-concept model and test it against real data. In November of 2017, Hitachi, led by Nirvana Farhadi, our Hitachi Vantara Global Head of Financial Services RegTech, joined with the FCA and BoE to sponsor a two-week TechSprint to develop a proof of concept that could make regulatory requirements machine readable and executable.

TechSprint.png

The TechSprint successfully proved that a regulatory requirement contained in the FCA Handbook could be turned into language that machines can understand and use to execute the requirement. For a brief 2:16 video of the highlights from this TechSprint, please click here.

 

While this “proof of concept” was groundbreaking, there is still a lot of work to be done to transition to a market-ready solution. Open questions include how to address the risk of inaccurate machine-executable interpretations, errors in the code base, lack of flexibility, growing complexity in the code base as the technical infrastructure expands, versioning challenges, opportunities for abuse, and security.

 

Follow-on work to the MDMERR TechSprint presents an interesting opportunity to test a data-based approach to representing law as MDMERR. Such a data-driven approach, including the use of machine learning to translate rules and ensure consistency across the code base, could help in translating regulatory text and identifying potential errors, among many other uses. The FCA published a Call for Input in February to ask for views on how these ideas could be developed further, and will publish its findings and next steps in the summer of 2018. In addition, the FCA is working with several banks to take this forward into a production environment within the next six to nine months.

 

A major step forward was taken this month when the UK Chancellor of the Exchequer, the Rt. Hon. Philip Hammond MP, published the Fintech Sector Strategy and publicly endorsed MDMERR, giving this project official UK Government support. Nirvana and her team continue to engage actively with the FCA and the Bank of England, and Hitachi Vantara looks forward to taking an active part in this initiative.

 

“The announcement by Chancellor Philip Hammond heralds the next stage of machine executable regulatory reporting. Hitachi Vantara is not only an early innovator in this process but is committed to helping drive forward such revolutionary change in financial services and beyond.”

 

Nirvana Farhadi

Global Head, Financial Services RegTech

Hitachi Vantara

There is a good probability that you or someone you know has diabetes. The World Health Organization estimates that 8.8 percent of the adult population worldwide has diabetes, a figure projected to rise to 9.9 percent by the year 2045. Type-2 diabetes is the most prevalent form and affects more people as the population ages; today one in every four Americans 65 years or older has Type-2 diabetes. The spread of Western lifestyles and diets to developing countries has also resulted in a substantial increase.

 

Word Wide Diabetes.jpg

 

Diabetes is a chronic, incurable disease that occurs when the body doesn't produce enough insulin, or any at all, leading to an excess of sugar in the blood. Diabetes can cause serious health complications, including heart disease, blindness, kidney failure, and lower-extremity amputations, and is the seventh leading cause of death in the United States. Diabetes can be controlled by medication, a healthy diet, and exercise.

 

The problem with medication is that there are many ways to treat the disease with different combinations of drugs. Some medications break down starches and sugars; others decrease the sugar your liver makes; some affect rhythms in your body and prevent insulin resistance; others help the body make insulin; still others control how much insulin your body uses; some prevent the kidneys from holding on to glucose; and others help fat cells use insulin better. New medications are being developed continuously as the population of diabetics increases. Diabetics often need to take additional medications for conditions that commonly accompany diabetes, such as heart disease, high cholesterol, retinopathy, and high blood pressure. The efficacy of the drugs changes with the patient's age and other physical factors, side effects differ depending on the individual's situation, and the drugs can be expensive. The effectiveness of treatment is measured every three months by a blood test for a measure called A1C, which reflects the average blood glucose level over the last three months. An A1C of 7.0% indicates that the blood glucose level, and the diabetes, are under control; however, 7.0% is an ideal reading, and higher readings may be acceptable depending on the individual. Up to now, prescribing medication has usually been a trial-and-error approach, and more than half of diabetes patients fail to achieve their treatment targets, according to the World Journal of Diabetes. Selecting and monitoring the most effective medication, or combination of medications, that is also safe, economical, and well tolerated by the patient is often hit or miss.
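For readers unfamiliar with the A1C number, it maps to an average blood glucose level. A commonly cited conversion is the estimated average glucose formula (eAG in mg/dL = 28.7 × A1C − 46.7); the exact coefficients come from clinical literature outside this article, so treat this as an illustrative reference:

```python
def estimated_average_glucose(a1c_percent):
    """Estimated average glucose (mg/dL) for an A1C percentage,
    using the commonly cited eAG linear conversion."""
    return 28.7 * a1c_percent - 46.7

for a1c in (6.0, 7.0, 8.0):
    print(f"A1C {a1c}% is roughly {estimated_average_glucose(a1c):.0f} mg/dL")
```

So the 7.0% target corresponds to an average glucose of about 154 mg/dL.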

 

On March 12, Hitachi and University of Utah Health, a leading institution in research on electronic health records and interoperable clinical information systems, announced the joint development of a decision support system that helps clinicians and patients choose pharmaceutical options for treating type 2 diabetes. The system uses machine learning methods to predict the probability that a given medication regimen will achieve targeted results. By integrating with electronic health records, it can personalize that guidance to an individual's characteristics.
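The announcement doesn't disclose the model's internals, but the core idea, scoring a regimen's probability of success from patient features, can be sketched with a simple logistic model. The feature names and weights below are invented for illustration; a real system would learn them from electronic health records:

```python
import math

# Hypothetical feature weights; a production system would learn these
# from electronic health record data rather than hard-coding them.
WEIGHTS = {"baseline_a1c": -0.9, "age": 0.01, "on_metformin": 0.6}
BIAS = 6.5

def probability_of_reaching_target(patient):
    """Logistic-regression-style estimate of the probability that a
    patient on a given regimen reaches the A1C target."""
    score = BIAS + sum(WEIGHTS[k] * patient[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))

p = probability_of_reaching_target(
    {"baseline_a1c": 8.2, "age": 55, "on_metformin": 1})
print(round(p, 2))  # ~0.57 with these illustrative weights
```

Presenting such probabilities for several regimens side by side is what lets clinicians and patients compare options on evidence rather than trial and error.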

 

Diabetes ML Model.png

The system compares medication regimens side by side, predicting efficacy, risk of side effects, and cost in a way that is easy for clinicians and patients to understand.

 

Diabetes Dashboard.png

Combining machine learning with an individual's health records increases the probability of selecting the combination of medications that will help that person reach their targeted goals for controlling diabetes. Think how this approach could be applied to other treatments, such as chemotherapy for cancer. If you know anyone with diabetes, please forward this post to them so they can see what becomes possible when machine learning is applied to the control of this disease.

Last week Hitachi Vantara Labs announced Machine Learning Model Management to accelerate model deployment and reduce business risk. This innovation provides machine learning orchestration to help data scientists monitor, test, retrain, and redeploy supervised models in production. These new tools can be used in a data pipeline built in Pentaho to help improve business outcomes and reduce risk by making it easier to update models in response to continual change. Improved transparency gives people inside organizations better insight into, and confidence in, their algorithms. Hitachi Vantara Labs is making Machine Learning Model Management available as a plug-in through the Pentaho Marketplace.

 

Machine learning is the study and construction of algorithms that can “learn” from data and make predictions on it by building a model from sample inputs, without being explicitly programmed. These algorithms and models become a key competitive advantage – and potentially a risk. Once a model is in production, it must be monitored, tested, and retrained continually in response to changing conditions, then redeployed. Today this work involves considerable manual effort and is often done infrequently. When that happens, prediction accuracy deteriorates and hurts the profitability of data-driven businesses.
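The monitoring step can be reduced to a simple idea: keep scoring the production model against recent labeled data, and flag it when accuracy falls below an acceptable floor. A minimal sketch of that check (the threshold and toy model here are illustrative, not part of the announced product):

```python
def accuracy(model, records):
    """Fraction of (features, label) records the model predicts correctly."""
    correct = sum(1 for features, label in records if model(features) == label)
    return correct / len(records)

def needs_retraining(model, recent_records, threshold=0.8):
    """Flag the model for retraining when its accuracy on recent,
    labeled production data drops below the threshold."""
    return accuracy(model, recent_records) < threshold

# A toy model whose decision rule has gone stale against newer data.
def stale_model(x):
    return x > 10

recent = [(5, False), (8, True), (12, True), (15, True), (9, True)]
print(needs_retraining(stale_model, recent))  # True: accuracy is only 0.6
```

Automating this check on a schedule is exactly the kind of manual effort the new tooling is meant to eliminate.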

 

David Menninger, SVP & Research Director, Hitachi Vantara Research, said, “According to our research, two-thirds of organizations do not have an automated process to seamlessly update their predictive analytics models. As a result, less than one-quarter of machine learning models are updated daily, approximately one-third are updated weekly and just over half are updated monthly. Out-of-date models can create significant risk to organizations.”

 

So, what is Machine Learning Model Management and where does it fit in the analytic process?

Machine Learning Model Management.png

Machine Learning Model Management recognizes that models need to be updated periodically, because the underlying distribution of the data changes and model predictions become less accurate over time. The four steps of Machine Learning Model Management are Monitor, Evaluate, Compare, and Rebuild, as shown in the diagram above. Each step implements a concept called “Champion/Challenger”: two or more models are compared against each other, and the one that performs best is promoted. Each model may be trained differently or use different algorithms, but all run against the same data. These four steps form a continuous process and can be run on a scheduled basis to reduce the manual effort of rebuilding models.
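The Champion/Challenger idea described above can be sketched in a few lines: score every candidate against the same validation data and promote the best performer. This is a generic illustration of the pattern, not Pentaho's implementation:

```python
def score(model, data):
    """Share of (input, expected) examples the model classifies correctly."""
    return sum(model(x) == y for x, y in data) / len(data)

def promote_best(candidates, validation_data):
    """Champion/challenger: run every candidate against the same
    validation data and promote the one with the highest score."""
    return max(candidates, key=lambda name: score(candidates[name], validation_data))

# The champion and challenger may be trained differently or use
# different algorithms, but both are judged on identical data.
candidates = {
    "champion": lambda x: x >= 5,
    "challenger": lambda x: x >= 3,
}
validation = [(2, False), (3, True), (4, True), (6, True)]
print(promote_best(candidates, validation))  # the challenger wins here
```

Running this comparison on a schedule is what turns the four steps into a continuous, largely hands-off process.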

 

Hitachi Vantara’s implementation of Machine Learning Model Management is part of the Pentaho data flow, which makes machine learning easier by combining it with Pentaho’s data integration tool. In the diagram above, the preparation of data may take 80% of the time needed to implement a model, with preparation processes that rely on coding or scripting by a developer. Pentaho Data Integration (PDI) empowers data analysts to prepare the data they need in a self-service fashion without waiting on IT. An easy-to-use graphical interface simplifies the transformation, blending, and cleansing of any data for data analysts, business analysts, data scientists, and other users. PDI also has a new capability that provides direct access to various supervised machine learning algorithms as full PDI steps that can be designed directly into your PDI data flow transformations.
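PDI expresses these steps graphically rather than in code, but the blend-and-cleanse pattern it automates looks roughly like the following. All record layouts and field names here are made up for illustration:

```python
# Illustrative blend-and-cleanse pass over two data sources, mimicking
# the kind of transformation PDI lets analysts build graphically.
crm = [{"id": 1, "name": " Acme "}, {"id": 2, "name": "Globex"}]
sales = [{"id": 1, "revenue": "1200"}, {"id": 2, "revenue": None}]

def blend_and_cleanse(crm_rows, sales_rows):
    revenue_by_id = {row["id"]: row["revenue"] for row in sales_rows}
    cleaned = []
    for row in crm_rows:
        raw = revenue_by_id.get(row["id"])
        cleaned.append({
            # Cleansing: trim stray whitespace from names.
            "name": row["name"].strip(),
            # Cleansing: coerce revenue to a number, defaulting missing values to 0.
            "revenue": float(raw) if raw is not None else 0.0,
        })
    return cleaned

print(blend_and_cleanse(crm, sales))
```

In PDI these join, trim, and type-coercion operations are individual drag-and-drop steps, which is what makes the preparation self-service for analysts.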

 

For more information on PDI and how it integrates with Machine Learning Model Management see the following blog posts by Ken Wood.

            Machine Intelligence Made Easy

            4-Steps to Machine Learning Model Management

Back in November 2014 I posted a blog on how “Controlling your explosion of copies may be your biggest opportunity to reduce costs”. I quoted a study by Laura DuBois of IDC which reported that 65% of external storage system capacity was used to store non-primary data such as snapshots, clones, replicas, archives, and backup data – up from 60% just a year earlier. At that rate, it was estimated that by 2016 the spend on storage for copy data would approach $50 billion and copy data capacities would exceed 315 million TB. I could not find a more recent study, but I would estimate that the percentage has increased, due to more online operations, ETL for analytics, DevOps, and the larger number of short-lived applications that tend to leave dark data behind that never gets cleaned up. Copies serve a very useful purpose in an agile IT environment, just as the massive underwater bulk of an iceberg provides the displacement that keeps the iceberg afloat. However, copies need to be monitored, managed, and automated to reduce costly waste and inefficiency.

 

iceberg.png

 

At that time in 2014, our answer for copy data management was a product called Hitachi Data Instance Manager, which came from the acquisition of the Cofio Aimstor product. Most users at that time were using it as a lower-cost backup solution. A key feature was a workflow manager with settable policies for scheduling the operations it controlled. Since then, Cofio and Hitachi engineers have worked to bring the latest enterprise features into the product and renamed it Hitachi Data Instance Director, or HDID (which sounds better than HDIM). HDID provides continuous data protection, backup, and archiving with storage-based snapshots, clones, and remote replication, in addition to protection on application server hosts.

 

In October of last year, with the announcement of the new Hitachi Vantara company, we announced Hitachi Data Instance Director v6, which was re-architected with MongoDB as the underlying database. The more robust database enables HDID to scale to hundreds of thousands of nodes, where previous versions scaled to thousands, so you can now set up as many users as you want with access rights. Another improvement was an upgrade from single-login access control to granular role-based access controls that align user access capabilities with the business’s organizational structure and responsibilities.

 

Another major enhancement was a RESTful API layer, which enables the delivery of recovery, DR, and copy data management as a private cloud service. Rich Vining, our Senior World Wide Product Manager for Data Protection and Governance, explains this in his recent blog post Expand Data Protection into Enterprise Copy Data Management:
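This post doesn't document HDID's actual REST endpoints, so the snippet below only illustrates the general shape of driving a copy operation through such an API layer; every URL, path, and field name here is hypothetical:

```python
import json

def build_snapshot_request(base_url, volume_id, retention_days):
    """Assemble a hypothetical REST request to provision a snapshot.
    HDID's real endpoint paths and payload schema are not described
    in this post; this only sketches the pattern."""
    return {
        "url": f"{base_url}/api/v1/snapshots",   # hypothetical path
        "method": "POST",
        "body": json.dumps({
            "volume": volume_id,
            # An expiry policy is what keeps copies from piling up.
            "retention_days": retention_days,
        }),
    }

req = build_snapshot_request("https://hdid.example.com", "vol-042", 30)
print(req["url"])
```

The point of a REST layer is exactly this kind of programmability: a private cloud portal or orchestration script can request, refresh, and expire copies without a human driving the console.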

 

“Hitachi Vantara defines copy data management as a method for creating, controlling and reducing the number of copies of data that an organization stores. It includes provisioning copies, or virtual copies, for several use cases, including backup, business continuity and disaster recovery (BC/DR), and data repurposing for other secondary functions such as DevOps, financial reporting, e-discovery and much more.”

 

Read Rich’s blog to see how HDID can solve the copy explosion that I described above by automating the provisioning of snapshots, clones, backups and other copy mechanisms, mounting virtual copies to virtual machines, automatically refreshing them and, more importantly, expiring them when they are no longer needed.

 

Think of HDID as a way to automate the copy data process and reduce the estimated $50 billion spend on storage for copy data.