
Big Data Defined IT: Business & Technology Predictions in Asia Pacific for 2015 (Part 3)

Blog Post created by Adrian De Luca on Dec 22, 2014

(This is a continuation of a previous blog)

 

Continuing our Back to the Future parody, you may remember the scene when Marty McFly, posing as his future son, walks into the Café 80’s.  He is confronted by two artificial avatars, Michael Jackson and Ronald Reagan, competing to take his order.  Although we don’t have Max Headroom-like cyborgs playing maître d’ in restaurants (though we can draw comparisons to Apple’s Siri), this scene illustrates how machines can be employed to improve the customer service experience.  Which gives rise to my next prediction:

 

#2 - Competitive industries will ramp up Big Data initiatives to gain competitive advantage

 

Although adoption of Big Data across Asia Pacific remains low, with over half of organizations making limited headway, particular industries are ahead of others.  Certainly businesses operating in competitive industries are no longer looking at Big Data as an initiative, but as an imperative.  As initial projects show promising new insights and customer engagement, competitors are making similar investments, driving a new “arms race” in key verticals.

 

The majority of big data adopters now appear to be banks and other financial services firms.  Employing correlated analytics on in-house data to assess borrower risk, detect churn and cross/upsell other products based on spending behaviours has helped a number of financial institutions retain their valuable customers as well as drive more wallet share from them.
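To make the idea concrete, here is a minimal sketch of how a churn-risk score might be derived from in-house customer records.  The feature set (tenure, monthly spend, products held, complaints) and the training data are hypothetical, and a real institution would train on records drawn from its own warehouse; scikit-learn here simply stands in for whatever analytics stack is actually in place.

```python
# Minimal churn-scoring sketch on hypothetical in-house customer features.
from sklearn.linear_model import LogisticRegression
import numpy as np

# Hypothetical historical records: [tenure_months, monthly_spend, products_held, complaints]
X_train = np.array([
    [60, 450.0, 3, 0],
    [ 6,  80.0, 1, 2],
    [24, 220.0, 2, 1],
    [ 3,  40.0, 1, 3],
])
y_train = np.array([0, 1, 0, 1])  # 1 = customer eventually churned

model = LogisticRegression()
model.fit(X_train, y_train)

# Score a current customer; a high probability flags a candidate for a retention offer
current = np.array([[12, 150.0, 1, 2]])
churn_risk = model.predict_proba(current)[0][1]
print(f"estimated churn risk: {churn_risk:.2f}")
```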

 

But this is only the beginning, as we are now seeing the next wave of projects which will provide even greater insight by mashing together even more data sources.  An example of this is MasterCard’s intent to mine Facebook data from its Asia Pacific user base to uncover behavioural insights to sell back to retail banks.  This “insight enrichment” concept demonstrates how in-house personal data, public social media data and geographical location from mobile devices will enable a new level of customer engagement and revenue potential.

 

In the telco world, with 4G expected to drive 14.5 times more data traffic than non-4G connections, Mobile Network Operators (MNOs) are investing in software to analyse wireless data in flight to optimize networks for demanding content delivery like video.  In this highly competitive market, the benefits are clear, as ensuring quality of service reduces customer churn.

 

In countries like Indonesia, where 168 million people are connected to the mobile network, service quality is a necessity to stay profitable.  This is where companies like PT Telkomsel are putting their data to work.  By understanding their customers’ usage patterns, they are using real-time analytics to identify the ‘next best offer’ for their subscribers, with a view to moving customers onto higher-yield service plans.  The result is around 2 million customers per month being upsold to broader mobile plans.
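As a rough illustration only, the sketch below shows how a ‘next best offer’ rule might pair observed usage with a higher-yield plan.  The plan catalogue, field names and thresholds are invented for the example and are not Telkomsel’s actual logic.

```python
# Hypothetical plan catalogue and a simple rule-based next-best-offer picker.
PLANS = [
    {"name": "Voice Basic",   "data_gb": 1,  "price": 5},
    {"name": "Data Standard", "data_gb": 5,  "price": 12},
    {"name": "Data Max",      "data_gb": 20, "price": 25},
]

def next_best_offer(usage_gb_per_month, current_price):
    """Pick the cheapest plan that comfortably covers observed usage
    and yields more revenue than the subscriber's current plan."""
    candidates = [p for p in PLANS
                  if p["data_gb"] >= usage_gb_per_month * 1.2   # leave some headroom
                  and p["price"] > current_price]               # higher yield only
    return min(candidates, key=lambda p: p["price"]) if candidates else None

# A subscriber on the $5 plan already using 3.8 GB/month is offered "Data Standard"
print(next_best_offer(usage_gb_per_month=3.8, current_price=5))
```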

 

Governments are also realising the potential of big data to improve citizen services.  With open data initiatives in countries like Australia, New Zealand, Singapore, South Korea, India and Hong Kong, thousands of datasets are now available for public consumption absolutely free.  For example, in Singapore, data from travel cards, GPS on trains and buses, and openly available schedule data were used to draw up a detailed model of how the residents of the city move through their transport system, resulting in a drop in peak-hour travel of between 7% and 13%.

 

This next generation of business intelligence solutions will not only require new infrastructure architectures to store and manage vast data lakes of information; combining industry-specific data with various interconnected software platforms will also be critical to delivering predictable and supportable projects.

[Figure: big_data_infrastructure.png]

In the world of Big Data applications, data ingest can come from many places and in many forms.  Collecting bulk data (i.e. enterprise data warehouse extracts) through means like ETL, versus streaming data (i.e. machine/device data, social media feeds) where events and metadata need to be extracted on the fly, requires different profiles of network and compute bandwidth.
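The contrast between the two ingest profiles can be sketched in a few lines.  The file layout and event fields below are hypothetical, and the two functions stand in for whatever ETL tool or message bus an organisation actually runs.

```python
# Simplified sketch of bulk (ETL) versus streaming ingest, standard library only.
import csv, json

def bulk_etl_load(extract_path):
    """Bulk path: load a warehouse extract in one pass (e.g. a nightly CSV dump)."""
    with open(extract_path, newline="") as f:
        rows = list(csv.DictReader(f))
    # ... transform and bulk-insert into the analytics store ...
    return len(rows)

def handle_stream_event(raw_event):
    """Streaming path: extract event fields and metadata as each record arrives
    (e.g. a device reading or social media post delivered over a message bus)."""
    event = json.loads(raw_event)
    record = {
        "source":    event.get("source"),
        "timestamp": event.get("ts"),
        "payload":   event.get("data"),
    }
    # ... enrich and forward to a low-latency store or processing pipeline ...
    return record

print(handle_stream_event('{"source": "sensor-42", "ts": 1419206400, "data": {"temp": 21.5}}'))
```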

 

Similarly, storing this data is not as trivial as sticking it into a traditional relational database.  The massive size and frequency of such data streams require technologies like distributed filesystems, memory grids, and key-value and graph databases, in some cases all of them, to help manage data at such scale.  As we see open source platforms like Hadoop gain greater adoption in the enterprise, organisations will be looking for support to implement, integrate and maintain such analytic systems.  This has led a number of solution providers like MapR, Hortonworks and Cloudera to expand their presence across Asia.
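For a feel of the programming model platforms like Hadoop support, here is a toy aggregation written in the Hadoop Streaming style, where the mapper and reducer simply read from stdin and write tab-separated key/value pairs to stdout.  The input layout and the choice of key are assumptions made for the example.

```python
# Toy MapReduce-style aggregation in the Hadoop Streaming idiom:
# count events per key (e.g. per customer ID in the first column).
import sys

def mapper():
    # Emit one (key, 1) pair per input record
    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0]:
            print(f"{fields[0]}\t1")

def reducer():
    # Hadoop delivers mapper output sorted by key, so counts can be summed in one pass
    current_key, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current_key:
            if current_key is not None:
                print(f"{current_key}\t{count}")
            current_key, count = key, 0
        count += int(value)
    if current_key is not None:
        print(f"{current_key}\t{count}")

if __name__ == "__main__":
    # Locally: cat events.tsv | python this_script.py map | sort | python this_script.py reduce
    mapper() if sys.argv[1:] == ["map"] else reducer()
```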

 

Processing and computing the volume and velocity of Big Data means that traditional server architectures are inherently inefficient for this type of workload.  This is leading to a different class of server hardware specification, one which is optimized for in-memory processing, scale-out modular interconnects, and software-defined provisioning and management programmability.  Projects such as the Open Compute Project and Novena have accelerated the development of these platforms, but enterprises will need a greater level of support to put them into production.

 

From a software perspective, PaaS can help accelerate many of the development and operational requirements of Big Data applications.  Much like the IaaS space, this is becoming an increasingly noisy market, with vendors like Pivotal, Red Hat OpenShift, Microsoft and Amazon all vying for their platforms to gain the greatest amount of market traction.  When it comes to visualization of data, organizations are looking to abstract this layer as much as possible from their underlying data management layer.  In many cases businesses are already using products like Pentaho, Tableau or Datawatch and would like to leverage these familiar user experiences in their enterprise.

 

Hitachi Data Systems has been partnering with SAP for a number of years to deliver one of the industry’s most scalable SAP HANA appliances.  In southern Australia, the power distribution industry depends on it to deliver analytics for smart meters, helping consumers understand their electricity usage behaviours.

 

In 2015, you will see even more integrated solutions from Hitachi Data Systems across a variety of industries that help organizations accelerate their adoption of big data.  This will not only be in the form of products, but also services and local partnerships across the region, so watch this space.
