Hu's Place

Various world organizations and financial institutions have sought to develop a digitization index to evaluate a country’s ability to provide the environment businesses need to succeed in an increasingly digitized global economy. The following study, done in 2016, is interesting because it shows a strong correlation between digitization and GDP per capita: the most digitally advanced countries have a higher GDP per capita.

Digitization chart.png

Where a country falls on the digitization index depends on how that index is defined. However, several studies using different digitization indices have produced very similar country rankings. BBVA has published a working paper defining a Digitization Index (DiGiX) which is widely accepted. This is a comparative index that rates countries from zero to one; the BBVA study shows Luxembourg with the highest rating of 1.0. Although the GDP chart above uses a composite index, the digitization index on the x-axis ranks the countries in a similar manner to the BBVA DiGiX. The y-axis is an assessment of GDP per capita excluding the impact of oil revenues. This exclusion may not be valid, since digitization could influence oil revenues in the same way it impacts all revenues.

 

The following slide shows the types of indicators that were considered in BBVA’s DiGiX.

Digital Index.png
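
Where a weighted composite of indicators like these produces the zero-to-one country scores described above, the mechanics are straightforward. Here is a minimal sketch in Python, with hypothetical indicator names, weights, and values; it is not BBVA's actual DiGiX methodology:

```python
# Sketch of a composite digitization index: min-max normalize each
# indicator across countries, then take a weighted average so that
# scores fall between 0 and 1. Indicator names, weights and values
# are hypothetical, for illustration only.

def min_max_normalize(values):
    """Rescale raw indicator values to a 0-1 range across countries."""
    lo, hi = min(values.values()), max(values.values())
    return {country: (v - lo) / (hi - lo) for country, v in values.items()}

indicators = {
    "broadband_penetration": {"A": 92.0, "B": 60.0, "C": 35.0},
    "internet_users_pct":    {"A": 97.0, "B": 75.0, "C": 50.0},
    "regulation_score":      {"A": 8.5,  "B": 6.0,  "C": 4.0},
}
weights = {"broadband_penetration": 0.4,
           "internet_users_pct": 0.4,
           "regulation_score": 0.2}

normalized = {name: min_max_normalize(vals) for name, vals in indicators.items()}

# A country that leads every indicator scores exactly 1.0, the way
# Luxembourg tops the BBVA DiGiX; the laggard on every indicator scores 0.0.
index = {country: sum(weights[n] * normalized[n][country] for n in weights)
         for country in ["A", "B", "C"]}
print(index)  # {'A': 1.0, 'B': ~0.48, 'C': 0.0}
```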

 

One anomaly on this chart is South Korea. South Korea is one of the most technologically advanced countries, home to technology leaders like Samsung, LG, and Hyundai. Korea Telecom dazzled everyone at the Winter Olympics with its display of 5G networks. Some 92.4% of the population uses the internet, and South Korea has consistently ranked first in the UN ICT Development Index. However, South Korea’s GDP per capita is below that of countries that rank much lower on the digitization index. This shows that digitization is not the only factor driving GDP.

 

Korean workers.png

South Korean workers are the hardest working in the world, logging some 300 more work hours per year than workers in most other countries, and yet productivity and prosperity lag behind less digitalized countries. One reason given is that the retirement system is not adequate to support retired workers, and many of them have to find low-paying jobs after retirement. Until recently the retirement age was 55, and young workers had to look forward to supporting their parents and grandparents. According to the New York Times, sky-rocketing household debt, high youth unemployment and stagnant wages are hobbling the economy. Young people have to scramble to compete for a small pool of jobs at large, prestigious companies or accept lower-paid work at smaller companies. Many cannot find work at all, and the unemployment rate for young people is 10%. These are political and social issues that go beyond digitalization.

 

Lim Wonhyuk, professor of economic development at the KDI School of Public Policy and Management, was quoted in the New York Times offering this suggestion: “The government needs to nurture a business ecosystem that is more ably disposed to start-ups protecting their intellectual property rights and giving them better financial access and incubating and supporting them.” What is needed is a way to spur innovation and disrupt the business models of established businesses.

 

Until recently, South Korea was not known as a tech startup hub. Very few “unicorns” (startup companies that achieve a $1 billion valuation) have come from South Korea. That may be about to change according to Crunchbase, a publisher of news covering the technology industry. Crunchbase noted in a recent article, South Korea Aims for Startup Gold, that there is an increase in VC funding and a competitive lineup of potential unicorns.

 

Another interesting country on the chart above is China. While the chart shows China to be very low on the digitization index, China leads the world in innovation when it comes to the number of unicorns. According to Business Insider, Chinese companies JD.com, Alibaba, and Moutai hold the top three spots on the list of the top 20 fastest growing brands, while Amazon holds the thirteenth spot. That has a lot to do with the fact that China has the largest population in the world, close to 1.4 billion. That also lowers China’s GDP per capita to about US$8,600. Consider what it would do to China’s GDP if the digitization rate were to increase by 10 points!

 

Digitization clearly shows a positive impact on GDP per capita. However, digitization does not occur without the focus of government since digitization has a lot to do with infrastructure, connectivity, and regulations. As a result, we are seeing government digitization initiatives around the world.

MyRepublic Candy.png

On one of my trips to Singapore in June 2016, I was invited to meet with a new startup in the telco space, a company called MyRepublic. MyRepublic was started in 2011 to leverage Singapore’s exciting Next Gen NBN roll-out – the first of many National Broadband Network (NBN) roll-outs happening in the Asia Pacific region. By January 2014, this startup was able to launch the world’s first 1Gbps broadband plan in Singapore under S$50. The telco industry is well established, with many large players with billions of dollars in revenue, so I was interested in knowing how a startup could possibly disrupt it and be the first to launch such a service. This disruption was on the scale of an Uber in the transportation business or an Airbnb in the hospitality business! Startups like MyRepublic have become known as telcotechs, a term that is increasingly used in the telco industry, similar to the use of fintech in the financial industry, to refer to technically innovative companies that are disrupting their industries.

In an interview in 2014, founder Malcolm Rodrigues attributed MyRepublic’s strong growth to what he called the “thin operator model”: “We think we’ve re-engineered the economics of a telco,” said Rodrigues. “Today we bill and invoice about 25,000 customers. All the invoicing and the CRM system are in the cloud. When I was at a telco before MyRepublic, we spent about $300 million dollars on an IT platform. [At MyRepublic] we built a cloud-based CRM and accounting system using the best stuff available and stitching it together through open APIs. I’d say we spent about 80,000 bucks to do that, and our running cost is around 3,000 dollars a month.”

When asked how he expected to make money with such a small number of customers, he said, “The telecom is a beautiful business. It’s all recurring revenue. When you’re selling software packages, you have to find new customers every month.” Because the service is a utility, customers tend to switch providers only after a long period of time. At that time, in 2014, MyRepublic held about 1% of the internet service provider market, with hopes of reaching 5% in a few years. While that is a small slice of the pie, they were able to triple their revenue that year from S$5M to S$15M. Rodrigues also believes that traditional telcos create a walled garden to discourage users from going to other services like WhatsApp or Skype. A new way of life is coming that will require telcos to work closely with content providers. He believes that MyRepublic must be committed to user experience and maintain credibility in the eyes of the public.

When I met with MyRepublic in 2016, they were providing ultra-fast internet service to over 30,000 homes and businesses in Singapore. Since they were cloud native, they were not interested in Hitachi’s IT infrastructure solutions; their interest was in industry trends and directions and in Hitachi’s IoT initiatives. Last week I participated in Hitachi Vantara’s Southeast Asia CIO and Partner conference, where we were fortunate to have Eugene Yeo, Group CIO of MyRepublic, as a main speaker. He provided an update and shared his experiences driving business agility and transformation across their footprint, which now includes Singapore, Australia, New Zealand and Indonesia, with plans for Cambodia, Myanmar, Malaysia, the Philippines, Vietnam, and Thailand. Today, MyRepublic has a customer base of over 200,000 households.

Eugene Yeo.png

This summer MyRepublic moved into the mobile space as a mobile virtual network operator (MVNO) by partnering with StarHub in Singapore and Tata Communications across Singapore, New Zealand, Australia, and Indonesia. An MVNO is a wireless communications services provider that does not own the wireless network infrastructure over which it provides services to its customers. It obtains bulk access to network services at wholesale rates, then sets retail prices independently, using its own customer service and billing support systems. This lets MyRepublic’s telcotech platform expand from home broadband into mobile services without any capital investment in its own mobile network infrastructure or services management, enabling MyRepublic to offer highly competitive pricing across all its services.
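
To make the wholesale-versus-retail economics of the MVNO model concrete, here is a quick sketch; all of the figures are hypothetical, purely for illustration:

```python
# Illustrative MVNO economics: buy network capacity at wholesale rates,
# set retail prices independently. All figures are hypothetical.
wholesale_per_gb = 1.50   # bulk rate paid to the host network operator
retail_price     = 40.00  # monthly plan price set by the MVNO
included_gb      = 20     # data bundled into the plan
support_cost     = 4.00   # per-subscriber customer service and billing

network_cost = wholesale_per_gb * included_gb        # $30.00
margin = retail_price - network_cost - support_cost  # $6.00
print(f"per-subscriber monthly margin: ${margin:.2f}")
```

Because the MVNO carries no network capital expense, its economics reduce to this per-subscriber margin at whatever scale it can reach.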

Eugene is a firm believer in using IT as a strategic tool to ensure that MyRepublic remains innovative and ahead of the competition. The tech innovations at MyRepublic include embracing the cloud, open source, and a whole slew of disruptive technologies to grow the business. They developed their own business support systems (BSS) and operations support systems (OSS) and adopted public cloud for the BSS and OSS stack. Yeo is also a staunch supporter of open source: the company partnered with Red Hat to deploy OpenStack in early 2017. With a team of 70 to 80 engineers, they use open source – from open source databases to open source libraries and workflow engines – to get to the next level faster. They can deploy to new markets in less than 60 days!

Recently, MyRepublic engaged Hitachi to help accelerate the company’s telecommunications technology strategy and enhance business operations across their customer base. They selected Hitachi Vantara’s Pentaho Data Integration and Business Analytics platform to help them expand further into the region and launch mobile services on top of existing broadband services. MyRepublic is using the Pentaho platform to improve productivity around data storage and operational efficiency, and to provide enterprise-grade scalability. With the Pentaho platform, MyRepublic was able to integrate and blend data from disparate sources and then create the necessary dashboards with just two software engineers. Pentaho also allows MyRepublic to easily embed dashboards within its BSS, CRM and back-office systems so that users can access insights while working within the operational systems.
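
As a conceptual illustration of that kind of blending (a minimal Python sketch with hypothetical sources and field names; actual Pentaho Data Integration work is built visually from transformation steps, not hand-written code):

```python
# Sketch of blending records from two disparate sources, the kind of
# integration described above. Field names and data are hypothetical.
import csv
import io
import json

# Source 1: billing records from the BSS, arriving as CSV.
billing_csv = io.StringIO("customer_id,plan,monthly_fee\n1001,1Gbps,49.99\n")
billing = {row["customer_id"]: row for row in csv.DictReader(billing_csv)}

# Source 2: support-ticket counts from the CRM, arriving as JSON.
crm_json = '[{"customer_id": "1001", "open_tickets": 2}]'
tickets = {rec["customer_id"]: rec for rec in json.loads(crm_json)}

# Blend on customer_id into a single record ready for a dashboard.
blended = [{**billing[cid], **tickets.get(cid, {"open_tickets": 0})}
           for cid in billing]
print(blended)
# [{'customer_id': '1001', 'plan': '1Gbps', 'monthly_fee': '49.99', 'open_tickets': 2}]
```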

MyRepublic Pentaho.png

Eugene Yeo said: “While we have made significant manpower savings, the bigger benefit is the robust data pipeline we’ve been able to build. Pentaho allows us to add data to this pipeline rapidly, which is important to this vision. Similar to what fintech players achieved with the financial services industry, it paves the way for us to create new data monetization models that will lead to further innovation in the industry.”

MyRepublic has plans to IPO within 24 months. We are very proud to be a partner with them and pleased to be able to support them with our products and services.

VMworld 2018.png

Data centers are digitally transforming from infrastructure providers into providers of the right service at the right time and the right price. Workloads are becoming increasingly distributed, with applications running in public and private clouds as well as in traditional enterprise data centers. Applications are becoming more modular, leveraging containers and microservices as well as virtualization and bare metal. As more data is generated, there is a corresponding growth in demand for storage space efficiency and storage performance, utilizing the latest in flash technologies. Enterprises will need to focus on a more secure, automated data center architecture that enables micro-segmentation and artificial intelligence, while reducing complexity and increasing flexibility.

 

Infrastructure choices can significantly impact the success of digital transformation (DX). Converged infrastructure (CI) and hyperconverged infrastructure (HCI) are gaining popularity with enterprises on the DX journey because of their inherent benefits. Converged and hyperconverged infrastructure can deliver business and IT value in new and evolving ways:

  • Improved system performance and scalability
  • Simplified management and operations
  • Single point of support
  • Reduced costs to deploy and maintain

 

Hitachi Vantara provides three unified compute platforms:

 

Hitachi Unified Compute Platform CI Series (UCP CI) converged infrastructure systems combine the power of high-density Hitachi Advanced Server DS series servers, built on the latest Intel Xeon Scalable processor technologies, with award-winning Hitachi Virtual Storage Platform (VSP) unified storage arrays. Flexible options let you configure and scale compute and storage independently to optimize performance for any workload, whether a critical business application or a new digital service. Automated management through Hitachi Unified Compute Platform Advisor (UCP Advisor) software simplifies operation and gives you a single view of all physical and virtual infrastructure from one screen. VSP G series and F series arrays take advantage of Hitachi’s unified technology and industry-leading all-flash features and performance. Networking is provided by Brocade Fibre Channel and Cisco IP switches. UCP CI targets midsize and large enterprises, departments and service providers who need the flexibility to combine mode-1 business applications with modern cloud-native services in a single, automated data center package.

 

Hitachi Unified Compute Platform HC (UCP HC) offers a scalable, simple and reliable hyperconverged platform. The UCP HC family simplifies the scale-out process and provides elasticity to closely align the IT infrastructure with dynamic business demands. Start with what you need and scale to keep pace with business growth, without committing massive capital upfront. UCP HC leverages x86 CPUs and inexpensive storage, integrated with VMware vSphere and vSAN, to reduce the total cost of ownership. It delivers high VM density to support a mix of applications, eliminating the need for storage sprawl. Modern data reduction technologies (deduplication, compression, erasure coding) reduce storage needs by up to seven times, boosting return on investment (ROI) on NVMe flash hyperconverged infrastructure. Recently introduced NVIDIA GPU acceleration dramatically improves the user experience in modern workspaces. You can read the datasheet here. The success story of South African pharmacy retailer Dis-Chem highlights the compelling business benefits of the Hitachi hyperconverged solution.
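
To illustrate how deduplication alone can drive that kind of reduction, here is a minimal content-addressable chunk store; this is an illustrative sketch, not vSAN's actual implementation:

```python
# Sketch of block-level deduplication: identical chunks are stored once
# and referenced by their content hash. Illustrative only.
import hashlib

CHUNK_SIZE = 4096
store = {}          # content hash -> chunk bytes, stored exactly once
volume_index = []   # ordered hashes that reconstruct the written data

def write(data: bytes):
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)   # a duplicate chunk costs nothing extra
        volume_index.append(digest)

# Ten identical VM images are written; each unique chunk is kept only once.
image = b"base-os-image" * 1000
for _ in range(10):
    write(image)

logical = 10 * len(image)                        # bytes the clients wrote
physical = sum(len(c) for c in store.values())   # bytes actually stored
print(f"data reduction ~ {logical / physical:.0f}:1")  # ~10:1 in this toy case
```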

 

Hitachi Unified Compute Platform RS (UCP RS) series accelerates your move to hybrid cloud by delivering a turnkey, fully integrated rack-scale solution powered by VMware Cloud Foundation. This rack-scale UCP RS solution combines the hyperconverged Hitachi UCP HC system, based on industry-leading VMware vSAN, with NSX network virtualization (SDN) and software-defined data center (SDDC) management software to deliver a turnkey integrated SDDC solution for enterprise applications. The SDDC manager and Hitachi Unified Compute Platform Advisor (UCP Advisor) non-disruptively automate the upgrades and patching of physical and virtual infrastructure, freeing IT resources to focus on other important initiatives. The SDDC manager automates the installation and configuration of the entire unified SDDC stack, which consists of nodes, racks, spine-leaf switches and management switches. UCP RS introduces new resource abstraction and workload domains for creating logical pools across compute, storage and networking. This dramatically reduces the manual effort required and fast-tracks the availability of IT infrastructure for private and hybrid cloud.

 

All these solutions will be on display at the Hitachi Vantara booth #1018 at VMworld Las Vegas, August 26-30. Dinesh Singh has blogged about what we will be covering at this VMworld. There will be solution demos and breakout sessions where you will learn best practices from your peers like Conagra Brands and Norwegian Cruise Line. There will also be short, crisp, 15-minute theater presentations running every hour, covering topics like NVMe storage, data analytics, SDDC hybrid cloud, and many more. Some of our alliance partners, like Intel, Cisco, and DXC Technology, will join these sessions. You will also be able to meet our technology experts who are developing the next cutting-edge solutions.

 

Be sure to save 4:30 PM on August 28 on your schedule for the popular Hall Crawl, where we will be serving delicious sushi and sake at booth #1018.

Flash.png

 

SHARE is a volunteer-run user group for IBM mainframe computers that was founded in 1955 and is still active today, providing education, professional networking, and industry influence on the direction of mainframe development. SHARE members say that SHARE is not an acronym; it is what they do. SHARE was the precursor of the open source communities that we have today.

 

The mainframe market is alive and well and may be on the verge of a renaissance in the coming IoT age. We have all seen the staggering projections of 30+ billion new internet-connected devices and a global market value of $7.1 trillion by 2020. That is almost 8 times the estimated 4 billion smartphones, tablets, and notebooks connected today. That translates into a staggering amount of additional new transactions and data, which means more compute and data access cycles, as well as storage. That many new devices connected to the internet also opens up many more security exposures.

 

These are areas where mainframes excel, with their unique architecture of central processing units (CPUs) and channel processors that provide an independent data and control path between I/O devices and memory. z/OS, the mainframe operating system, is a share-everything runtime environment that gets work done by dividing it into pieces and giving portions of the job to various system components and subsystems that function independently. Security, scalability, and reliability are the key criteria that differentiate the mainframe, and they are the main reasons why mainframes are still in use today, especially in high-transaction, high-security environments like core banking. These same capabilities will be required by the backend systems that support IoT.

 

Hitachi Vantara is one of the few storage vendors that support mainframes, with its scalable VSP enterprise systems. We are a Silver Sponsor of the coming SHARE St. Louis 2018, August 12-17. We will be sponsoring booth #324, where you can meet with our mainframe specialists to have your questions answered or to share your requirements. We will also be presenting the following topics:

 

Tuesday, August 14, 4:30 PM - 5:45 PM

Hitachi Mainframe Recovery Manager Reduces Risk and Simplifies Implementation Effort

Hitachi Mainframe Recovery Manager (HMRM) is a simpler, more focused, lower-cost, streamlined mainframe failover and recovery solution that provides the functionality you actually care about and nothing you don’t.

Room: Room 100

Session Number: 23402

Speaker: John Varendorff

 

 

Wednesday, August 15, 11:15 AM - 12:15 PM

Hitachi Vantara G1500 Update - Latest and Greatest

The Hitachi Virtual Storage Platform G1500 continues the evolution of Hitachi’s virtualized storage architecture with advances in the technology which have increased performance, expanded usability with active-active stretched clusters, extended storage virtualization with Virtual Storage Machines (VSMs), and added non-disruptive scale out capabilities to extend the life of the system.

Room: Room 267

Session Number: 22876

Speaker: Roselinda Schulman

 

Thursday, August 16, 12:30 PM -1:30 PM

Hitachi Vantara, Data Resilience and Customer Solutions for the Mainframe in the IoT Age

Please join Hitachi Vantara for a discussion on the IoT age and how it affects mainframe environments today and into the future. This session will discuss data resilience and its importance, technologies for now and the future, and analytics and its growing importance in companies across the globe. We will discuss what solutions Hitachi Vantara has today for mainframe environments and where we may be going in the future.

Speaker: Lewis Winning

 

Please visit our booth and sessions at SHARE St Louis, and see how our mainframe systems, software, and services will support the security, scalability, and reliability required by the age of IoT.

It’s been two weeks since my cancer surgery, and this is my first blog post since then. At Hitachi Vantara there is a major focus on developing IoT solutions in various industries, including healthcare, so my experience with chemotherapy, surgery, and hospital care was very interesting from a recipient’s point of view. It gave me a much greater appreciation for the tools that healthcare requires today and for where improvements are needed.

 

Surgey.png

The first need is the sharing of information among all the healthcare providers involved in a patient’s care, and the optimization of the mix of medications and treatments required by each discipline: oncology, surgery, cardiology, endocrinology, urology, neurology, psychology, etc. Many of these medications have side effects which may interfere with other treatments, and some have to be suspended or altered when a new disease or condition needs to be treated. Each condition alone has many different treatment options. In my case, the DNA of my cancer cells is being analyzed to determine the best treatment. Hitachi provides content platforms and content intelligence systems for centralizing and sharing large volumes of data. Hitachi also has many projects applying AI and machine learning tools, working with major medical facilities around the world. Most that I know of are targeted studies, like optimizing the mix of medication for personalized diabetes control or cardiac sleep studies, so there is a lot to learn through AI and machine learning about the interactions of different medical conditions.

 

My chemo treatment consisted of a portable pump which I carried for 48 hours every two weeks, which enabled me to continue most of my work prior to the recent surgery. The purpose of the chemo was to isolate and shrink the tumors in my liver and small intestine so that they could be surgically removed. Chemo treatment consists of infusing your body with a cocktail of poisons that inhibit the growth of, or kill, the cancer cells. Unfortunately, it kills healthy cells as well. The purpose of the DNA study was to find a way for my immune system to attack this particular cancer. That study is still ongoing. The chemo mix that was given is called "5FU". If you look it up on social media, it is often referred to as “5 feet under” for its effect on some patients. There needs to be a faster way to develop safer immunological treatments. Since immediate treatment was required, I opted not to wait for the DNA study and went with the chemo and surgery.

 

The surgery was another area for advanced technologies. The procedure, as the surgeon described it, was amazing. Mounted in front of him in surgery were two screens showing an MRI scan and a CT scan taken earlier to identify the locations of the lesions. These are cross-sectional views of my body which he had to correlate with the longitudinal view of what he saw on the table in front of him, through a 6-inch incision in my abdomen. The MRI and CT scans are point-in-time views of flexible organs which change in real time as the surgeon starts to work on them. The targeted lesions are in organs that are hidden behind other tissues and organs. Using a real-time ultrasound imaging system during surgery, he was able to locate the lesions, some as small as several millimeters in size, and excise them. The large tumor in the small intestine had to be extracted from the surrounding tissues and lymph nodes and the intestine reconnected to the stomach. The removed tissues will be preserved for future studies; prior to the operation I signed a permission that allows Stanford Research to use these tissue samples until the year 2502! The surgery took 5 hours of very concentrated, intricate, skillful work. You can imagine the physical strain that this relatively minor surgery would put on the surgical team; imagine what an organ transplant would require! While Hitachi Healthcare provides tools to assist surgeons, like the real-time ultrasound imaging system that was used in this surgery, it may be some time before robots will be able to replace the skilled surgeons in these types of surgeries.

 

This experience has given me a much greater appreciation for the possible benefits of AI, machine learning, and robotics that companies like Hitachi are working on. Most of you know someone who has gone through similar life experiences, perhaps even in your own lives. This gives a new level of urgency to what we do at work. It’s not just about the pay and recognition. It is really about social innovation and the difference it can make in our lives.

Bill Schmazo.png

 

I am very pleased to welcome Bill Schmarzo to our Hitachi Vantara community. Bill is very well known in the big data community, having authored two books, “Big Data: Understanding How Data Powers Big Business” and "Big Data MBA: Driving Business Strategies with Data Science." He is considered the Dean of Big Data. He’s an avid blogger and frequent speaker on the application of big data and advanced analytics to drive an organization’s key business initiatives. Bill teaches at the University of San Francisco (USF) School of Management, where he is their first Executive Fellow.

 

Bill joins Hitachi Vantara as a Big Data Analytics Visionary, CTO IoT and Analytics. In this role he will guide the technology strategy for IoT and Analytics. With his breadth of experience delivering advanced analytics solutions, Bill brings a balanced approach regarding data and analytic capabilities that drive business and operational outcomes.  Bill will drive Hitachi Vantara’s “co-creation” efforts with select customers to leverage IoT and analytics to power digital business transformations.  Bill’s background includes CTO at Dell EMC and VP of Analytics at Yahoo.

 

When he joined Hitachi Vantara, he posted two blogs on LinkedIn which I am linking to below to help you understand his thinking and his perspective on what we are doing at Hitachi Vantara.

 

In the first post, Bill’s Most Excellent Next Adventure, he explains why he left Dell EMC with a heavy heart after 7 marvelous years and decided to join Hitachi Vantara. Here is a quote from that post. Please take the time to read the complete post.

 

“I will be joining Brad (Brad Surak, Chief Product and Strategy Officer) at Hitachi Vantara – the digital arm of Hitachi – as their Chief Technology Officer for Internet of Things (IOT) and Analytics. I will have a chance to leverage nearly everything that I have been working over the past 20 years. And instead of just talking, writing and consulting on Digital Transformation, this time I will get the chance to actually do it; to get my hands dirty and bloody my elbows making it all work. It truly is the best job I could have dreamed off, and if this will be the final chapter in my working career, then damn it, let’s make it a great one!

And I promise to take you all along on this most excellent adventure.  You can learn first-hand what’s working, what’s not working and what we all can do to make Digital Transformation a reality. You will learn from my successes and learn even more from my failures.  And in the end, I hope that we will all be a bit smarter from the experience.”

 

In his second post, My Journey Through the Looking Glass: Hitachi Vantara Introduction, Bill gives his impressions of his first week at Hitachi Vantara.

 

“I spent my first week at Hitachi Vantara attending sales kickoff, and wow, was I impressed! Hitachi Vantara is making a pivot to build upon its core product and services strengths to deliver data-driven innovations around data monetization and business outcomes. And I could not be in a happier place (sorry Disney), as this is everything that I have been teaching and preaching about the past 30+ years; that organizations must become more effective at leveraging data and analytics to power our customers’ business models.”

 

In this post he goes on to give his basic views on big data, digital transformation, IoT, and analytics: business model maturity, helping customers with their digital journey, the why of digital transformation (it starts with the business), monetizing your IoT, digital transformation value creation, and digital transformation customer journey mapping, with links to previous blogs in which he addressed these topics. Please read this post and follow the links to these topics for a wealth of great insights. I expect to see even more as Bill begins posting on our Hitachi Vantara community.

 

He concludes this post with the following comment:

 

“I can’t say enough about how energized I am about where Hitachi Vantara is going from an IOT, analytics and digital transformation perspective. It matches everything that I have been teaching and preaching over the past many years, and I am eager to build it out in more detail via hands-on customer engagements.”

 

I have been reinvigorated just by meeting Bill and hearing his validation of our strategy and direction. Please welcome Bill Schmarzo to our Hitachi Vantara Community!

Hu Yoshida

Beyond Cool!

Posted by Hu Yoshida, Jun 20, 2018

In my last blog post I commented on Hitachi Vantara’s selection as one of the “Coolest Business Analytics Vendors” by CRN (Computer Reseller News) and expanded on Hitachi Vantara’s business analytics capabilities. CRN’s report positions business analysis tools at the top of the big data tools pyramid, deriving insight and value from the ever-growing volume of data. In this post I will expand on how we address the rest of the big data pyramid.

 

Data Fabric.png

 

Other analysts and trade publications, like Network World, also refer to this as a big data fabric, which is gaining more attention as analytics becomes a driving force behind business outcomes. Whether you think of it as a big data pyramid or a big data fabric, the concept is a converged platform that supports the storage, processing, analysis, governance and management of the data that is currently locked up in different application silos, which is the biggest hurdle to overcome in developing meaningful and accurate business analytics.

 

A comprehensive big data pyramid or big data fabric must provide features and functionality, such as data access, data discovery, data cleansing, data transformation, data integration, data preparation, data enrichment, data security, data governance, and orchestration of data sources, including support for various big data fabric workloads and use cases. The solution must be able to ingest, process, and curate large amounts of structured, semi-structured, and unstructured data stored in big data platforms such as Apache Hadoop, MPP, EDW, NoSQL, Apache Spark, in-memory technologies, and other related commercial and open source projects, and do it simply, efficiently, and cost effectively.

 

The strength of the information fabric you weave is directly affected by the quality of the data you stitch together. For any organization actively investing in anything remotely close to a data lake, the focus must be on data quality before data use. A key point that many organizations seem to miss in determining the worth of their data is that just because data is being collected does not mean it is the right data. They may be collecting very little of something very important, or not collecting the right data at all. Data quality impacts business effectiveness.

 

Data quality efforts are only reliable when they occur as close as possible to the point of data creation, long before the data is given asylum in the data center or blended downstream for some other business purpose. This is important because it is here that Hitachi Vantara really shines. Both Hitachi Content Intelligence and Pentaho can be used as “data quality gateways,” designed and implemented in the data stream to improve data veracity and bolster the source of truth expected of the information fabric. Regardless of whether we are talking about discovery, orchestration, management, governance, control, or preparation, focusing on the quality and correctness of data is what makes the information fabric reliable and trustworthy. More importantly, when you perform these veracity activities is up to you – that is the power offered by Hitachi Vantara’s solutions. Certainly, we would suggest that it be well before data is stored in your data center, but we do not force that best practice on our customers.
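
A minimal sketch of the data quality gateway idea follows; it is illustrative Python with hypothetical field names and rules, not Hitachi Content Intelligence or Pentaho code. Records are validated and normalized in the stream, and suspect records are quarantined before anything lands in the data lake:

```python
# Sketch of an in-stream data-quality gateway: validate and normalize
# records close to the point of creation, quarantine the rest.
# Field names, rules, and data are hypothetical.
from datetime import datetime

def validate(record: dict):
    """Return (clean_record, None), or (None, reason) for quarantine."""
    if not record.get("sensor_id"):
        return None, "missing sensor_id"
    try:
        record["timestamp"] = datetime.fromisoformat(record["timestamp"])
        record["value"] = float(record["value"])
    except (KeyError, ValueError):
        return None, "malformed timestamp or value"
    if not -50.0 <= record["value"] <= 150.0:
        return None, "value out of plausible range"
    return record, None

stream = [
    {"sensor_id": "t-01", "timestamp": "2018-08-01T12:00:00", "value": "21.5"},
    {"sensor_id": "",     "timestamp": "2018-08-01T12:00:01", "value": "22.0"},
    {"sensor_id": "t-02", "timestamp": "2018-08-01T12:00:02", "value": "999"},
]

accepted, quarantined = [], []
for rec in stream:
    clean, reason = validate(rec)
    if clean:
        accepted.append(clean)
    else:
        quarantined.append((rec, reason))

print(len(accepted), "accepted;", len(quarantined), "quarantined")
```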

 

Additionally, our two products allow you to sub-segment your datasets based on the desired business outcome. For example, if you are working with known datasets in a manner that allows you to answer very specific questions in a time-sensitive way, Pentaho provides the right solution, with additional capabilities to blend and visualize that data. If the business outcome is based more on exploratory activities across multiple datasets and allows time to conduct the exploration, then Hitachi Content Intelligence provides the ideal solution. Both can process data in-stream and at contextual levels. Both offer the ability to leave the data where it resides, or to augment and migrate the data to our data services platform (Hitachi Content Platform), where it can be stored for its lifespan in a compliant, self-protected, and secure manner.

 

Data is a reusable commodity: value may be gained from different data points, and data can continue to provide additional valuable insights that may not have been imagined in the first analysis. Hitachi Content Platform is the ideal repository for long-term retention of big data, with its geo-distributed, no-backup data protection, secure multitenancy, governance and security features, extensible metadata, self-healing reliability and availability, low-cost erasure-coded storage, cloud gateway, and the speed and scalability that leverage the latest advances in infrastructure technology.

 

This is a very high-level view of what we provide for big data fabrics. In subsequent posts, I will expand on some of these concepts, which differentiate us in the big data space. Hitachi Vantara has the most comprehensive set of big data and big data analytics tools, built around our integrated Hitachi Content Platform with the Hitachi Content Intelligence and Pentaho solution sets.

Hu Yoshida

How Cool Is That!

Posted by Hu Yoshida, Jun 12, 2018

CRN (Computer Reseller News), a leading trade magazine, has named Hitachi Vantara one of the 30 Coolest Business Analytics Vendors. It may be a surprise for many to see Hitachi Vantara, part of a 118-year-old company with traditional values like Harmony (Wa), Sincerity (Makoto), and Pioneering Spirit (Kaitakusha-Seishin), in the middle of a list of technology startups. Hitachi and Hitachi Vantara consider business analytics to be one of the key drivers of our customers’ success in this age of big data, digital transformation and IoT, and we are approaching business analytics with the same “startup” or pioneering spirit that has sustained us for over 118 years.

 

Cool Analytics.png

Hitachi Vantara’s appearance in this list of 30 “cool” companies may also be surprising from a “coolness” standpoint. Most of these companies are hip new startups. The next oldest company is Microsoft, which, like us, has had to reinvent itself many times to remain relevant.

CRN.png

Actually, Hitachi Vantara is the new kid on the block, since it was formed in September 2017 through the merger of Hitachi Data Systems (IT infrastructure systems), Hitachi Pentaho (data integration and analytics), and Hitachi Insights (IoT). CRN recognizes that Hitachi Vantara is able to provide “cloud, Internet of Things, big data, and business analytics products under one roof.” CRN cites Pentaho as a core Hitachi Vantara product for data integration, business analytics and data visualization. CRN also mentioned Pentaho’s new machine learning orchestration tools, available as a plug-in through the Pentaho Marketplace, which help data scientists better monitor, test, retrain, and redeploy predictive models in production.

 

We have registered over 1,500 Pentaho enterprise licenses. However, since Pentaho is open source, with a thriving community of open source users, there are hundreds of thousands of open source users, and we are adding about 5K to 10K new users per week. While Pentaho earns us a place on this list, there is much more that Hitachi Vantara can provide for big data and business analytics.

 

CRN’s report positions business analysis tools at the top of the big data tools pyramid to derive insight and value from the ever-growing volume of data. Hitachi Vantara focuses on the entire pyramid, since the insights and value are only as good as the data that goes into them.

Pyramid.png

While Pentaho is a core product in our analytics portfolio, we have other analytic tools like:

  • Hitachi Content Intelligence, part of our Hitachi Content portfolio, automates the extraction, classification, enrichment, and categorization of data residing in Hitachi Vantara and third-party repositories, on-premises or in the cloud.
  • Hitachi Data Streaming Platform provides proactive data streaming analytics to transform streaming IoT data to valuable business outcomes.
  • Hitachi Video Analytics can drive new business success through insights into customer behavior and preferences.
  • Hitachi Infrastructure Analytics Advisor uses machine learning to prescribe optimal IT infrastructure performance SLAs to improve user satisfaction, simplify budget forecasting with predictive analysis, and accelerate fault resolution, using AI to diagnose root causes, prescribe resolutions, and enable admins to automate fixes (a simplified sketch of the predictive piece follows below).
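
As a simplified illustration of the predictive piece of such analytics, here is a least-squares trend projection with hypothetical utilization numbers; the product's actual machine learning models are of course far more sophisticated:

```python
# Sketch of predictive analysis for IT operations: fit a linear trend to
# observed storage-pool utilization and project when an alerting
# threshold will be crossed. Numbers are hypothetical.
months = [0, 1, 2, 3, 4, 5]
util   = [52.0, 55.1, 57.9, 61.2, 63.8, 67.1]   # % of pool capacity

n = len(months)
mean_x = sum(months) / n
mean_y = sum(util) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(months, util))
         / sum((x - mean_x) ** 2 for x in months))
intercept = mean_y - slope * mean_x

THRESHOLD = 85.0  # capacity-pool alerting threshold, in %
months_until = (THRESHOLD - intercept) / slope - months[-1]
print(f"growth ~{slope:.1f}%/month; threshold crossed in ~{months_until:.1f} months")
```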

 

Hitachi Vantara also has the good fortune to be part of a larger Global Hitachi corporation that has operational expertise in many industries, from healthcare, to energy, to transportation systems. This expertise is critical in developing industry or business specific analytic models and automation tools that drive business outcomes.

 

CRN put together this list of 30 business analytics companies for the following purpose:

 

“…we've put together a list of 30 business analytics software companies that solution providers should be aware of, offering everything from simple-to-use reporting and visualization tools to highly sophisticated software for tackling the most complex data analysis problems.”

 

Hitachi Vantara is proud to be recognized as one of the 30 Coolest Business Analytics Vendors in CRN’s Big Data 100. We congratulate the other members of this list. Since big data and analytics require an ecosystem of vendors, I am sure that we will be working with many of them, as we already work with vendors like Microsoft and Salesforce. We will be working with many more vendors and customers as we continue to develop the pyramid of big data tools required to address our customers’ business requirements.

 

"Cool" wasn't in anybody's vocabulary 118 years ago, but the essence was captured in Harmony, Sincerity, and Pioneering spirit.

In order for companies to be more agile in responding to changing customer needs and market dynamics, they must have a storage infrastructure that makes data available at the right time, at the right place and in the right format, so that they can derive value from it and turn raw data into insights that drive business outcomes. With the explosion of data and the increasing demands on that data, data centers must focus more on the data and the information that can be derived from it than on the storage infrastructure that supports it. In the past, when data centers were considered cost centers, their efficiency was measured by how many terabytes of capacity could be managed by one full-time employee. Now data centers are measured on how fast they can drive innovation in delivering data, information, and applications.

Cogniant.jpg

However, the storage infrastructure is still very important. It must support many more development platforms and applications, scale quickly to meet demand, comply with increasing availability and governance requirements, span the breadth of requirements from edge to core to cloud, and seamlessly incorporate new technologies in order to stay on the leading edge. So the question is: how does a data center maintain a leading-edge storage infrastructure when the focus needs to be on business outcomes?

 

Storage infrastructure workloads can be reduced through a shared services or managed services approach. A storage service with automated tools for configuration management, analytics for optimization, and central management of copies, clones, replicas, and backups can greatly reduce operations and personnel costs and improve efficiencies. This will help data centers free up more resources to focus on development and applications as they transform into an information technology driver for business transformation.

 

Buying storage as a service breaks the traditional buying cycle, where you buy enough storage capacity for the next 5 years at today’s price and technology, even though you know that both the price and technology are likely to undergo major changes over those 5 years. If your capacity is 100 TB today and growing at a 20% CAGR, you have just bought about 250 TB for the next five years when you only needed 120 TB for the next year. What happens in year three when you find that the business requirements have changed and you need more or less capacity, or the performance of your storage puts you at a disadvantage to competitors with newer technology? Storage as a service provides the agility to buy what you need when you need it and frees you up to respond to future business requirements such as big data initiatives and smart products and services. Storage as a service helps enterprises meet business goals with improved commercial viability and optimized, efficient data management.
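
The arithmetic behind that example is worth making explicit; here is a quick sketch using the same figures:

```python
# The traditional buying-cycle arithmetic from the example above:
# 100 TB today, growing at a 20% CAGR.
current_tb = 100.0
cagr = 0.20

for year in range(1, 6):
    needed = current_tb * (1 + cagr) ** year
    print(f"year {year}: ~{needed:.0f} TB needed")
# year 1: ~120 TB ... year 5: ~249 TB
# Buying up front means paying today's prices for ~250 TB, most of which
# sits idle for years, and only if the 20% growth forecast actually holds.
```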

 

Hitachi Vantara has partnered with Cognizant Technology Solutions, a world-leading professional services company, to bring industry-leading storage as a service (STaaS) solutions to small, medium and enterprise companies, to facilitate the transition of their data center models for the digital era. The Hitachi Virtual Storage Platform (VSP) scales from small to enterprise with enterprise-ready software-defined storage, advanced global storage virtualization, and efficient workload consolidation. In addition to the VSP, Hitachi Vantara provides the AI-driven Hitachi Infrastructure Analytics Advisor to analyze and recommend improvements in operations, simplify budget forecasting and accelerate fault resolution; Hitachi Automation Director to improve performance, avoid bottlenecks, and optimize end-to-end performance and resiliency; and Hitachi Data Instance Director to optimize data protection and copy data management.

 

The combination of Cognizant’s proven professional services leadership, bespoke solution design, vertical business experience, and track record deploying and managing large, complex storage infrastructure for customers, coupled with Hitachi Vantara’s industry-leading storage hardware and software solutions, will provide a new level of storage solutions that enable sustainable IT transformation from infrastructure to information.

 

STaaS is an integrated software-defined storage service that reduces cost, minimizes complexity, manages growth efficiently, and helps customers transition their storage environment securely from a ‘plug in-migrate-run-retire’ model to a consumption-based ‘as-a-service’ model.

 

sappurpose-player-wide.jpg

This week I had the pleasure of being a panelist on SAP Radio. The moderator was Bonnie D. Graham, Global Thought Leadership Media Director and the creator, producer, and host of SAP Game-Changers Radio. Other panelists included Karin Underwood, a first-year MBA student at the Stanford Graduate School of Business, where she is co-president of the Social Innovation Club and winner of the Impact Design Immersion Fellowship; she represented the views of the next generation of business leaders. Katie Morgan Booth, who leads Corporate Social Responsibility (CSR) for SAP North America, also joined us to give her perspective from a large, 10,000-employee technology and information company.

 

The question we were to address was “Can doing good using social innovation be good for your company’s bottom line?”

Serving the world.png

I pointed out the growing social challenges due to the increase in world population, mega cities, climate change, and dwindling resources, which provides an opportunity for corporations to develop new markets, enable new consumers, and create an environment that will enable social innovation and business growth.

 

Karin, the MBA student from Stanford, said that the notion that young people entering the workforce should earn a lot of money first and give back later is broken for many of her classmates. They want to find a way to do big, impactful things in the world early in their careers, and businesses have a huge opportunity to show their commitment. The nature of work has changed, and employees are voting with their feet. When businesses focus only on maximizing profits for their shareholders, they are making short-term choices that can hurt their ability to attract top-quality talent and to create economic value and returns for society in the long term.

 

Katie, the CSR director from SAP, was concerned about the pace and scale at which innovation is accelerating and its impact on people, organizations, and communities. We need to prevent people from being left behind. Inclusive education and workforce readiness programs are crucial to economic, social and environmental sustainability, as well as to future innovation. We have the responsibility to meet people where they are and provide them with the hard and soft skills needed to secure employment in a digital workplace. We need responsive solutions and coordination from all parts of society – governments, citizens and private industry alike – to re-envision an educational system based on lifelong learning that can fully prepare workers for the jobs of the future. In CSR we have the unique opportunity to get our employees out into their communities, take them outside their comfort zones, and show them how others live and the challenges that others face. When they see this, it often gives them a push to go further and deeper to learn about social issues and try to improve them. Many people take that spirit and challenge the status quo within our company, within their own jobs and the products they work on. The more someone is challenged, the more they build empathy for others’ experiences, and that is what is necessary to move the world forward.

 

I can relate to Katie’s comments on CSR. Although I work on Hitachi’s Social Innovation strategy and the different technologies that can drive innovation and create sustainable social change as well as business outcomes, it is important to keep myself grounded in what this means for individuals as well as for society in general. This last weekend, I participated in a Relay for Life in my hometown of Morgan Hill, just 30 miles south of where I work in Silicon Valley. Relay for Life is a volunteer fundraising event for the American Cancer Society. Teams are formed to walk in a 24-hour relay around the local community park. I joined my daughter Elizabeth’s team. Each team has a tent where they hold fundraising events like raffles, sell crafts the teams have made, and distribute educational material about the different forms and treatments of cancer. More importantly, it is a time to share and support each other. The teams are formed by neighbors, friends and families who come together to honor the memory of a cancer victim, support a cancer patient or celebrate a survivor. We had the opportunity to hear firsthand from cancer survivors and caregivers, like the young mother who was first diagnosed with cancer 9 years ago, went into remission, but recently learned that the cancer had returned. Hearing these individual stories from the people in our community adds urgency to everything we do.

FullSizeRender.jpg

The conclusion of our panel was that social innovation will be good for a company’s bottom line, and that if companies are to attract the new generation of innovative business leaders, they must focus their business strategies on more than maximizing profit. Social innovation should also be a personal goal for each of us, to build a healthier, safer, more sustainable world for everyone.

 

The future is full of social challenges that will drive a need for Social Innovation. Addressing this need will require collaboration across government, business, and non-profits. In order for these innovations to be sustainable, corporations will need to integrate Social Innovation into their business strategy so that profits are tied to social innovation.

 

Many of the social challenges will come from an exploding world population. According to Wikipedia, the world population grew from 2 billion in 1927 to 7.6 billion in 2018 and is expected to reach 9.8 billion by 2050. The most rapid growth will be in countries with lower standards of living, while people in countries with higher standards of living will be living longer, putting a growing strain on health care and retirement systems. However, if population growth should decline, there may be even greater problems, as fewer young workers struggle to support a burgeoning elderly population. Soylent Green is not the type of solution that we would like to contemplate.

soylent-green-year-1973-usa-director-richard-fleischer-movie-poster-EJ260B.jpg

More people will consume more resources, like water, energy, infrastructure, goods and services. Some of these resources are already very limited like clean water, carbon fuels, and the rare earth metals required for new technologies. More people will also create more pollution and waste which will lead to health issues, climate change, and food shortages. With climate change we will see an increasing pattern of floods, droughts, global warming and rising sea levels that threaten to inundate our coastal urban areas.

 

By 2050, 70% of the world’s population will live in cities, where problems like unemployment, slums, crime, homelessness, traffic congestion, sanitation, urban sprawl, and overwhelmed social services will increase. More and more people will migrate from rural areas seeking a better life but lacking the skills for urban employment, and refugees fleeing poverty and oppression in other countries will create challenges for integration into mainstream society. Social problems will be especially challenging for mega-cities with populations over 10 million. For example, many of these mega-cities are hundreds of years old, built on antiquated underground water and sewage systems that are difficult to update without disrupting the infrastructure above them. Istanbul, a mega-city that is over 1,000 years old, loses an estimated 34% of its potable water to leakage.

 

There can be tremendous opportunities for companies to achieve great profits in such densely populated areas. Many analysts predict that new markets like IoT could reach $1 trillion by 2025. This could create tremendous wealth for some and a wider gap between the haves and have-nots. Gentrification, the influx of more affluent residents into urban neighborhoods, can drive up the cost of housing and retail space, displacing poorer people and small businesses and creating an even wider gap. This leads to alienation and discrimination, which are the breeding ground for terrorism, where individuals or groups of individuals take violent action against the public to advance their political, religious, or ideological goals. Public safety will be a major concern for urban areas, particularly urban areas with high visibility.

 

There is a clear and pressing need to address social problems if we are to have a healthier, safer, and sustainable lifestyle for ourselves and our children’s children. Although many companies have a CSR (corporate social responsibility) program where they donate funds to social causes, sponsor charity events, and encourage employee participation in outside social initiatives, this does not go far enough. Wanting to do good is not enough. Sustainable social change can only happen when corporations integrate Social Innovation into their corporate strategy for delivering business outcomes. By helping to solve social challenges, a Social Innovation strategy will help corporations build a sustainable business model and long-term viability. They can do this by building new markets, strengthening supply chains with access to sustainable resources, investing in talent diversification, enabling new consumers, and helping to create a social environment that is conducive to business growth. Social Innovation includes the process of transforming an idea or invention into a solution that creates value for society and stakeholders.

 

Corporations that are focused only on short-term profit may be first to market with new technologies, but they may create even greater social problems. The late Stephen Hawking warned that AI could be the worst event in the history of civilization, and he urged creators to employ best practices and effective management when creating AI. Hitachi integrates the principles of kaizen in its approach to AI, so that AI is used to empower workers rather than displace them.

 

Hitachi has always been conscious of the environment and the need for social innovation, beginning in 1910 when Hitachi was established to build electric motors that improved the efficiency of mining operations. In 2009, during the global economic crisis, Hitachi announced our strategy to strengthen our Social Innovation business, stating that we would be “concentrating to build a more stable earnings structure with a focus on the Social Innovation business, which comprises social infrastructure supported by highly reliable and highly efficient information and telecommunications technology.” When you map that 2009 statement on the Social Innovation business forward to today, you can see how it fits our IoT strategy, comprising OT (social infrastructure) and IT (information and telecommunications) technologies. Through IoT, Hitachi will be able to address many of the social problems described above while improving our bottom line.

 

Since then, Hitachi has delivered innovative solutions from clean water to smart cities that have addressed social problems and improved our bottom line so that we can continue to grow the company and invest in new Social Innovation projects. As our Hitachi Vantara CEO Brian Householder has described it, we are working to a double bottom line, delivering solutions and outcomes to benefit business and society.

Hitachi Future.png

While the acronym IT stands for information technology and is synonymous with the data center, in reality the focus of IT has often been more on infrastructure, since infrastructure represented the bulk of a data center’s capital and operational costs. Digital transformation and the need for more agile business outcomes require the transformation of IT from an infrastructure focus to an information and application focus.

 

Triangles.png

Digital transformation is not just about being more efficient in what we normally do. Digital transformation means turning everything upside down and changing our focus. For the data center, digital transformation means focusing on business outcomes through information and the applications that support your customer. Customers don’t care what vendor’s infrastructure you have or what you have to do to manage it, as long as they get the service they expect.

 

The figure above is not meant to suggest that infrastructure is unimportant. On the contrary, infrastructure becomes more important, since it needs to support a host of new and changing development platforms and applications and must be more agile and flexible, leveraging the latest technologies to meet changing business requirements. Data is the fuel that drives digital transformation, and since data is persistent (it will outlive the applications that created it and the technologies that store it), the choice of storage infrastructure becomes even more important. In the figure above, I would place storage infrastructure at the tip of the spear when it comes to digital transformation. The question, then, is how you shift your focus away from storage infrastructure when it is so important. The answer lies in a smart approach to data center modernization.

 

Data center modernization requires an agile data infrastructure, modern data protection, and intelligent operations.

3 Pillars.png

An Agile Data Infrastructure leverages the latest advances in technologies like flash, compression, and deduplication to scale performance to millions of IOPS and GB/s of bandwidth, and to scale capacities to petabytes with multiple millions of volumes and snapshots. An agile data infrastructure includes software that scales from small rack-scale storage to multi-frame enterprise systems; consolidates block, file, and object storage; and supports virtualization, cloud, and persistent storage for containers. Virtualization is a key differentiator for consolidating heterogeneous vendor storage systems and enabling non-disruptive migration to future storage systems.

 

Modern Data Protection provides 100% data protection, backed by a written guarantee. It provides dual-active storage systems for availability with zero recovery time and zero recovery point, along with synchronous and asynchronous replication. Modern data protection eliminates the need for traditional backup and simplifies the management of copies. It also includes security and privacy features that are designed in to protect the data whether it resides on edge devices, on mobile devices, in the core, or in the cloud. While Hitachi will do everything to protect your data, we want you to own your data, not us. Where encryption is required, we give you control of the encryption keys.

 

Intelligent Operations will enable you to harness the power of data to improve and automate operational efficiency, anticipate customer demand, and generate new revenue streams. As the demands on the data center increase, operational efficiencies can deteriorate, and exposure to downtime and data loss increases. An AI-powered brain is needed to provide deeper data center insights by looking across the entire data path, from virtual machines, servers, and networks to storage, using machine learning to optimize, troubleshoot, and predict data center needs. AI can be integrated with an automation engine that orchestrates the delivery and management of IT resources, freeing data center operations staff to work on the information and applications that support business requirements. The automation tool should also integrate with other infrastructure services: IT service management tools like ServiceNow® for tracking and control of IT resource delivery, REST APIs for provisioning third-party resources including storage, and data protection tools like Hitachi Data Instance Director to protect against data loss and downtime.
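To make the REST-driven provisioning idea concrete, here is a minimal Python sketch of how an operator script might request a new volume from an automation service’s REST API. The endpoint URL, payload fields, and token are hypothetical placeholders for illustration, not the actual Hitachi Automation Director API.

    import requests

    # Hypothetical endpoint, payload fields, and token for illustration only --
    # this is not the actual Hitachi Automation Director API.
    AUTOMATION_API = "https://automation.example.com/v1/storage/volumes"
    API_TOKEN = "replace-with-a-real-token"

    def provision_volume(size_gb, pool, label):
        """Ask an automation service's REST API to provision a storage volume."""
        response = requests.post(
            AUTOMATION_API,
            json={"sizeGB": size_gb, "pool": pool, "label": label},
            headers={"Authorization": "Bearer " + API_TOKEN},
            timeout=30,
        )
        response.raise_for_status()  # surface HTTP errors instead of ignoring them
        return response.json()       # e.g., an ID or status for the new volume

    volume = provision_volume(512, "gold-tier", "analytics-01")

The point of wiring such calls into an automation engine is that routine provisioning no longer requires a storage administrator in the loop.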

 

 

New Enhancements to Hitachi Vantara Data Modernization Offerings

Hitachi Vantara recently announced enhancements to its Agile Data Infrastructure and Intelligent Operations portfolio.

 

The Agile Data Infrastructure portfolio includes new enterprise-class Hitachi VSP models: the all-flash VSP F700 and VSP F900 and the hybrid flash VSP G700 and G900 systems. To reach a broader range of customers, Hitachi is also introducing new midrange models: the VSP F350, F370, G350 and G370 systems. The systems are powered by the next generation of the Hitachi Storage Virtualization Operating System, SVOS RF, which has been re-architected to deliver the following improvements:

  • Up to 3x IOPS improvement
  • 25% lower latency
  • 3.4x faster performance with data reduction enabled
  • Up to 2.5x greater capacity scalability, 8x more volumes and 1 million snapshots
  • Modern workload support, including plug-ins for Docker and Kubernetes containers
  • Backed by a 4:1 data efficiency guarantee

For Intelligent Operations, Hitachi Vantara has integrated and enhanced its AI operations software portfolio to ensure the highest return on data center investments and to accelerate strategic outcomes. The new integration of Hitachi Infrastructure Analytics Advisor (HIAA), Hitachi Automation Director (HAD) and Hitachi Data Instance Director (HDID) simplifies data center management and sets the foundation for autonomous operations across the data center.

  • HIAA improvements:
    • Predictive analytics to better forecast future resource needs
    • AI-driven heuristic engine that recommends fixes and repairs up to 4x faster
    • HIAA integration with HAD to automate implementation of HIAA recommendations
  • HAD improvements:
    • Automated configuration of Hitachi Data Instance Director (HDID)
    • Automated configuration of VSP quality of service (QoS)
    • Integration with IT Service Management (ITSM) tools, including ServiceNow, for improved tracking

 

To see how this new generation of all-flash and hybrid flash Hitachi Virtual Storage Platforms, with the next-generation Hitachi Storage Virtualization Operating System and the integration of Hitachi Infrastructure Analytics Advisor with Hitachi Automation Director and Hitachi Data Instance Director, will enhance our modern data center portfolio and accelerate the transformation of the data center, see the following announcement letter and video:

 

https://www.hitachivantara.com/en-us/pdf/datasheet/vsp-g-series-hybrid-flash-midrange-cloud-solutions-datasheet.pdf

Recently I had the pleasure of meeting with a group from KT, South Korea’s largest telephone company. It was very exciting to hear about their experiences at the Pyeongchang Winter Olympics, where they partnered with major companies like Intel and Samsung, using KT’s 5G wireless network technology to deliver the most high-tech Olympic Games in history! With the world’s telecom giants racing to unveil the world’s first 5G network, KT was the first to provide a large-scale pilot service and showcase it at one of the world’s most public venues, the Winter Olympics.

 

Intel drones.png

 

The Pyeongchang games opened with a dazzling display of a record-setting 1,218 Intel drones with onboard LEDs, joined and orchestrated through 5G connectivity to a central computer. Intel also provided live and on-demand VR coverage of 30 events, a project powered by the 5G network. Between three and six camera ‘pods’, each containing 12 4K video cameras, were used for events such as speed skating, alpine skiing and bobsleigh. These cameras generated as much as 1TB of data per hour. While Gigabit Wi-Fi could have provided this speed and capacity, 5G provided ubiquitous coverage across the venue and real-time control because of its low latency. In KT’s 5G pilot, a video demonstrated the speed and capability of 5G, using real-time, 360-degree video of competing athletes displayed on a Samsung 5G display, a feat that would be impossible on current 4G technology without buffering. Although peak 5G speeds could be 20 times faster than 4G, the speeds reached in these trials were four times faster than 4G, which still allowed for crisp streaming of the Games’ action from all angles.

 

In addition to video streaming and VR, other 5G-enabled use cases included artificial intelligence (AI) robots that helped inform and entertain fans and athletes in Korean, Chinese, Japanese and English, and self-driving buses that served thousands of fans with safe, efficient transport between venues. The speed of the 5G network enabled the buses to receive information in real time from a central control center, helping them avoid obstacles and collisions with other vehicles. The buses were also able to download and display 3D video files on transparent screens, giving fans a front-row seat even before they arrived at their chosen event.

 

Samsung equipped two Dutch speed skaters with smart suits, with sensors positioned over the material to feed live body-position data to the skaters’ coaches. Coaches were able to analyze their racers’ posture and suggest improvements using coded signals sent to a communications device on the skater’s wrist. While the suits were used only in training and not in competition, the Dutch team won gold medals in seven of the ten individual speed skating events and four medals in short track speed skating. Over the years the Netherlands has built a dedicated culture and athletic infrastructure for producing speed skating superstars. What improvements could be possible with 5G technology in their training regimen?

 

This successful showcase of 5G technology will help accelerate the development of standards and the delivery of commercial 5G networks in 2020 or sooner. While 4G wireless mobile technology revolutionized the consumer market and boosted the use of cloud, 5G is set to transform the edge for IoT and Industry 4.0. 5G is the fifth generation of wireless communications technology, and it will enable new kinds of users on the edge that require very low latency, low power consumption, low cost, high reliability, and exponentially higher data loads. 5G will enable the sharing of information among systems like smartphones and robots, as well as bi-directional machine-to-machine (M2M) communications, providing data services that differ from the voice services offered to mobile phone users. Vertical industries like e-commerce, manufacturing, medicine, automotive, oil and gas, and logistics are developing transformational business propositions on top of 5G.

 

Congratulations to South Korea for delivering an amazing, high-tech Winter Olympics, and to KT for providing the first large-scale pilot of a 5G network, which will help advance the commercialization of this technology and unleash the full potential of IoT and Industry 4.0 applications.

 

Here is a short video of the amazing opening drone display at the Pyeongchang Winter Olympics:

According to Wikipedia, serverless computing is a cloud computing model in which the cloud service provider dynamically manages the allocation of machine resources. Serverless computing still requires servers; the name is used because the server management and capacity planning decisions are completely hidden from the developer or operator. Serverless code can be used in conjunction with code deployed in traditional styles, such as microservices, and runs in containers. The drawing below illustrates what the cloud service provider supplies and what the user (developer or operator) supplies with serverless computing compared with Infrastructure as a Service.

serverless.png

 

Serverless computing is offered as a cloud service, such as AWS Lambda. To use Lambda, you write code (in C#, Java, Node.js or Python), set a few simple configuration parameters, and upload everything (along with required dependencies) to Lambda. This package is what Lambda calls a function, which can be triggered automatically from other AWS services or called directly from a web or mobile app. Lambda takes care of everything else: you provide the code, and Lambda deploys the function in a container and provides everything required to run and scale it with high availability. Lambda persists the container until the function has done its job, then the container disappears. Another way to describe serverless computing is Function as a Service, or FaaS. AWS Lambda was introduced in 2014, and since then other cloud providers have rushed to provide similar capabilities.
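As a minimal sketch of what such a function looks like in Python: the (event, context) handler signature is Lambda’s standard convention, while the field name and greeting logic below are illustrative placeholders.

    import json

    def lambda_handler(event, context):
        """Entry point that AWS Lambda invokes; 'event' carries the trigger's payload."""
        name = event.get("name", "world")  # read an illustrative field from the event
        return {
            "statusCode": 200,
            "body": json.dumps({"message": "Hello, " + name + "!"}),
        }

You would upload this file (along with any dependencies) as the function package and attach a trigger, such as an API Gateway request or an S3 event.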

 

The benefits, according to AWS, include no servers to manage; continuous scaling, precisely matched to the size of the workload, by running code in parallel in response to individual triggers; and sub-second metering, where you are charged for every 100ms that your code executes and pay nothing when your code is not running. Serverless computing is inexpensive. It uses containers, but you do not need to deploy or manage them. It is low maintenance, since you do not need to provision containers, set system policies and availability levels, or handle backend server tasks. The standardized programming environment and the lack of server and container overhead mean that you can focus on writing code.

 

Serverless computing has some very definite limits. You are bound by the implementation constraints of the cloud service provider. For example, Lambda has built-in restrictions on size, memory use, and the time available for a function to run. There is also a limited list of natively supported programming languages, and it is important to keep functions small, since a few high-demand functions can overload the service or lock everyone else out. Serverless computing runs in a multi-tenant environment, so there is always exposure to speed and response time variations and outages due to the demands or bad behavior of other tenants. Monitoring, debugging, and performance analysis may also be restricted due to the lack of visibility into the backend services behind Lambda. And since your software is wired into the provider’s interfaces, there is vendor lock-in; if you Google serverless computing and vendor lock-in, you will see many arguments pro and con.

 

So, what are the use cases for serverless computing? The best-suited functions are small and short-lived. Some functions that fit this model are real-time analytics triggered by anomalies in a data stream (a sketch of this case follows the figure below); ETL that performs data validation, filtering, sorting, and other transformations before loading the transformed data into another data store; and the backend for an IoT application, where a sensor triggers the need for a spare part and a function automatically places the order. Here are a few use cases that AWS Lambda proposes:

Lambda Use cases.png
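As a hedged sketch of the real-time analytics case, here is a Python Lambda function triggered by an Amazon Kinesis stream. The Records/kinesis/data event shape is Lambda’s documented Kinesis format; the temperature field and threshold are illustrative assumptions.

    import base64
    import json

    TEMP_THRESHOLD = 80.0  # illustrative threshold for flagging an anomaly

    def lambda_handler(event, context):
        """Triggered by a Kinesis stream; flags readings above a threshold."""
        anomalies = []
        for record in event["Records"]:
            # Kinesis delivers each payload base64-encoded inside the event
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
            if payload.get("temperature", 0.0) > TEMP_THRESHOLD:
                anomalies.append(payload)
        # A real pipeline might publish anomalies to SNS or write them to a data store
        return {"anomalies": len(anomalies)}

Because the function runs only when new records arrive, you pay nothing while the stream is quiet.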

In some ways serverless computing is the next abstraction beyond containers. A container gives the user more control and portability, but that also comes with more administration. The main benefit of a container is that it consists of an entire runtime environment: an application, plus all its dependencies, libraries and other binaries, and the configuration files needed to run it, bundled into one package that runs reliably when moved from one computing environment to another. Containers let developers deploy, replicate, move, and back up a workload even more quickly and easily than they can using virtual machines. A container-based application can be as large and complex as you need it to be. It would be easier to redesign a monolithic application into container-based microservices than to attempt the same redesign with serverless computing, where size and memory constraints would create multiple bottlenecks.

 

Serverless computing is also compared with microservices. Microservices are a change in architecture in which a single monolithic application is broken down into a set of self-contained small services running on their own machines (or instances). They use lightweight mechanisms like REST interfaces for communication, as sketched below. Microservices can be reused in different applications, eliminating the duplication of effort when the same service is required by different applications. Microservices carry operational overhead that serverless computing does not: an underlying operating system that must be deployed and monitored for availability, plus application deployment and configuration overhead and ongoing support and maintenance. With serverless computing, you leave all of that to the cloud provider and pay only for the time of use, in 100ms increments. On the other hand, the advantage of microservices with containers is full control of the environment, while with serverless computing you are limited to what the cloud service provider enables for you.
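For contrast, here is a minimal sketch of a microservice exposing a REST interface, written in Python with Flask; the resource names and in-memory store are illustrative. Unlike a serverless function, you own this process and everything required to run, scale, and monitor it.

    from flask import Flask, jsonify, request

    app = Flask(__name__)
    orders = {}  # in-memory store for illustration; a real service would use a database

    @app.route("/orders", methods=["POST"])
    def create_order():
        """Create an order from the JSON body of the request."""
        order_id = str(len(orders) + 1)
        orders[order_id] = request.get_json()
        return jsonify({"id": order_id}), 201

    @app.route("/orders/<order_id>", methods=["GET"])
    def get_order(order_id):
        """Return a single order, or a 404 if it is unknown."""
        if order_id not in orders:
            return jsonify({"error": "not found"}), 404
        return jsonify(orders[order_id])

    if __name__ == "__main__":
        # You are responsible for running, scaling, patching, and monitoring this
        # process -- the operational overhead that serverless hands to the provider.
        app.run(host="0.0.0.0", port=8080)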

 

Serverless computing, microservices, and containers are not competing systems; they are complementary. Serverless computing is another computing model that should be considered to increase agility and efficiency in code development and application deployment.

This week I had the opportunity to hear Michael Sherwood, Director of Technology and Innovation for the city of Las Vegas, talk about the IoT innovations that he helped implement there. For Michael, IoT is less about the technology and more about the outcome: making the city safer, smarter, and healthier, and saving money for the city through greater efficiencies. The bottom line drives the story, and like Hitachi, he works to a double bottom line: one for the business and one for society. This fits perfectly with Hitachi’s vision for Social Innovation.

 

Some of the projects that he has been able to implement are a self-driving shuttle, smart intersections, smart trash collection, and a smart edge network.

 

HOpOn.png

 

The self-driving shuttle project is called “Hop On”. The autonomous vehicle, which launched last November, is limited to eight passengers as it travels a 3/5-mile loop in downtown Las Vegas. Locals and tourists may ride for free. If you have been to Las Vegas, you have experienced the long walks between the hotels, where the distance doesn’t justify the cost of a taxi or Uber, yet walking between hotels can be exhausting: the city blocks are long, and the hotels are so gigantic that perceptions of distance are distorted. The first glitch with Hop On occurred within hours of its launch, when a semi truck backed into the shuttle before the shuttle could go into reverse to avoid the collision. Fortunately, no one was hurt. The first lesson learned was to put a horn on the shuttle to warn other drivers.

 

Sensors and cameras at intersections can monitor the flow of traffic and optimize traffic signals to minimize wait times for vehicles and pedestrians, cutting down on carbon emissions from idling cars. Cameras can monitor public places not only for safety but also for trash collection. Now, instead of collecting trash on a fixed schedule, whether it is needed or not, trash can be collected on demand, freeing up workers for other tasks like graffiti removal.

 

Las Vegas.png

 

Las Vegas owns the majority of its streetlights, and the city is upgrading these fixtures to create an intelligent platform, not only for lighting but also for a security and communications network. Each upgrade includes multi-colored LED lights, a module for a security camera system, and options for fiber, WiFi, and cellular connectivity. Other modules that may be included are gunshot detection and LIDAR (Light Detection and Ranging), which uses pulsed lasers to measure distances. When multiple streetlights are connected together, they can create a mesh network that enables automated actions, such as light-driven alerts that are managed locally rather than requiring a connection back to a cloud; a simple sketch of this idea follows below.
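As a purely illustrative sketch of that local, light-driven decision-making, consider a simple rule table in Python that maps a sensor event to an immediate action at the pole; the event names and actions are hypothetical, not the city’s actual system.

    # Hypothetical edge rules for a connected streetlight: act on local sensor
    # events without a round trip to the cloud.
    LOCAL_RULES = {
        "gunshot_detected": "flash_red_and_alert_neighboring_poles",
        "pedestrian_detected": "raise_brightness",
    }

    def handle_event(event_type):
        """Map a local sensor event to an immediate action, with no cloud round trip."""
        return LOCAL_RULES.get(event_type, "no_action")

    print(handle_event("gunshot_detected"))  # -> flash_red_and_alert_neighboring_poles

The value of handling such rules locally, on the mesh, is that an alert is not delayed or lost if the backhaul to the cloud is congested.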

 

When asked what Hitachi platforms are involved in these projects, Michael mentioned Hitachi’s Visualization Suite and Pentaho. Hitachi’s HVP camera solutions enable edge recording and compute, which are key components in addressing network efficiency and policy for video storage and access. Michael emphasized the need for analytics, which he believes will be the next big thing over the coming five years.

 

Michael was also asked whether it is hard to attract talent to Las Vegas to work on these projects. He said that he has had no problem attracting the right people. Talent is drawn to these projects in Las Vegas not for the money, but for the vision and the chance to be part of something truly innovative.