I was very interested to see the latest IDC data on external storage arrays, and shocked to learn that spinning disk is not actually dead yet. In fact, the market for Hybrid Flash Arrays (HFA) was still larger than that for All Flash Arrays (AFA), at least this quarter. So what gives? Why do spinning-rust arrays still rate a mention at all?

The current situation reminds me of my bemusement when I arrived at Hitachi a few years ago. I came from a networking background, not a storage specialty by any means, and I naively asked why we even needed a category of “AFA” when you could obviously plug disks or flash into these boxes. If you wanted an “AFA”, you could just configure one with all flash, so why was this even a category?

“Who on earth hires network guys?” was the most common answer.

I remember there was much hand-wringing among the sales team because we could not get a mention on the AFA quadrants: our VSP at the time could be configured, by choice, with flash, disk or both. To be eligible for the AFA quadrant, an array needed to be configurable only and exclusively with flash, and one needed to be on that latest and coolest quadrant to keep up with the new cool kids.

Well, as a non-storage guy at the time, this had me totally confused. To my mind it was like creating an “All 100G” router category that excluded routers able to support multiple interface speeds through pluggable optics. As a network engineer, I would have found that ridiculous, and I’m sure my customers would have too. So I found an “AFA” category that excluded arrays capable of flash, disk or both quite perplexing.

It seemed to me that the AFA category had the effect of marginalizing vendors that could offer both in the same box, for no good reason at all. “Quadrant proliferation”, I would have called it at the time. Well, it turned out I was “educated”: there are, allegedly, architectural benefits to excluding spinning disk and optimizing for flash, and it was on this basis that the separate category should exist. So significant would these architectural differences be that an AFA would blow an all-flash-configured HFA off the map – obviously!

Well, I guess I could wear that, though, just as I would expect the same argument to apply to “100G only” routers, I would have expected this architectural benefit to show up in real-world performance rather than require the arbitrary protection scheme of a distinct quadrant.

It turns out the real-world performance was not the quantum leap that a separate quadrant would imply. In fact, many vendors made the transition to the AFA quadrant through “SKU engineering”, making variants of their HFAs available in all-flash-only configurations. Guess what: they got on the quadrant and kept up fine with the AFA cool kids on the performance front.

So, was the alleged performance benefit of restricting the media choice for the sake of a quadrant category really a win for customers? It was certainly a win for new vendors, who did not have to support “legacy” media types, but for many customers the cost-versus-performance tradeoff of spinning disk was still worth it. After all, if you are willing to trade off performance and cost to tier to “the cloud”, why would a middle tier of cheaper on-premises storage not also be a choice you might want to make?
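To make that tradeoff concrete, here is a quick back-of-the-envelope sketch in Python. The per-terabyte prices and the 20/80 hot/cold split are invented for illustration only, not vendor or market figures:

```python
# Illustrative only: all prices and capacities below are hypothetical
# assumptions. The point is how a cheaper middle tier changes blended cost.

def blended_cost_per_tb(tiers):
    """tiers: list of (capacity_tb, cost_per_tb) tuples."""
    total_tb = sum(cap for cap, _ in tiers)
    total_cost = sum(cap * cost for cap, cost in tiers)
    return total_cost / total_tb

# All-flash: 100 TB at a notional $300/TB
all_flash = blended_cost_per_tb([(100, 300)])

# Hybrid: 20 TB of flash for hot data, 80 TB of disk at a notional $60/TB
hybrid = blended_cost_per_tb([(20, 300), (80, 60)])

print(f"all-flash: ${all_flash:.0f}/TB, hybrid: ${hybrid:.0f}/TB")
```

With these made-up numbers the hybrid configuration lands at roughly a third of the all-flash blended cost, which is the kind of arithmetic that keeps spinning disk in the conversation for less performance-sensitive tiers.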

Of course, flash is faster, better and getting cheaper, and will obviously “win” the storage wars, but it is not yet cheap enough in less performance-sensitive uses to kill the disk. It may be shocking to some that HFA is still outselling AFA (at least this week), but it seems that choices, rather than quadrants, are still calling the shots.


According to a recent study conducted by IDC and Microsoft, Digital Transformation is expected to contribute more than US$1 trillion to Asia Pacific GDP in 2021. In today’s digitally connected world, the pressure on organizations to drive innovation and increase operational efficiency is growing. As many governments become advocates of this change, IT leaders are being urged to adopt and accelerate digital transformation plans.


Many companies are now questioning where to start their journeys toward modernization. At Hitachi Vantara, we believe innovation begins at the fundamental level – with the infrastructure that lies at the heart of all digital economies. For companies to truly deliver, they need agile data infrastructure, modern data protection and intelligent operations, all of which are made possible with data center modernization.


The Foundation

Consumers are accustomed to on-demand, cloud-based solutions that are always available and highly personalized. Delivering on enterprise demands for real-time customer engagement requires an end-to-end approach to data management that can deliver low latency performance even as quantities of data grow exponentially.


Data center modernization begins with a foundation that can consistently deliver data at high speed, whilst supporting diverse workload requirements and multi-cloud integration. While growing companies often struggle with performance degradation as workloads increase, flash can deliver significant performance, response time and environmental benefits for high-performance applications.


One of our customers, Infosys, a global leader in consulting, technology and outsourcing solutions, implemented Hitachi flash modules and tiering solutions rather than adding more servers, which would have increased performance lag. With large-scale transaction-processing capabilities and higher performance, Infosys ended up with twice the performance and density of standard solid-state disks (SSDs) at half the price per bit.


Digital Age Security

Data center modernization enables businesses to reduce operational overheads, eliminate inefficiencies and provide platforms that reduce data security risks.


From large financial institutions to small educational institutes, the challenge with data security is the same: eliminating the manual and costly processes that prevent businesses from rapidly capturing new opportunities and expose them to risk.


For public companies such as BOE Technology Group Company Limited (BOE), data reliability and protection are vital, particularly when workers have traditionally stored data on personal computers. To address this, BOE turned to very large scale VDI to deliver secure desktops. With so many desktops, the company suffered boot storms that its previous storage solutions could not cope with.


By turning to the Hitachi Virtual Storage Platform and Hitachi Storage Virtualization Operating System, BOE was able to achieve the performance required for large-scale VDI, centralize the management of its core data for higher security, and deliver uninterrupted, high performance 24/7.


Applying Artificial Intelligence to Operations

As data center operations become more complex, operational efficiency can deteriorate and the risks of downtime or data loss increase. To ensure the highest return on data center investments, companies must enhance their infrastructure to increase operational efficiency.


By embracing artificial intelligence (AI), enhanced analytics and IT automation software, businesses can improve performance, reduce administrative time and costs, and quickly identify and eliminate performance bottlenecks. AI can provide deeper data center insights by looking across the entire data path, including virtual machines, servers, networks and storage.
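As a toy illustration of what “looking across the entire data path” can mean in practice (this is not a description of any particular product), consider decomposing end-to-end I/O latency by stage and flagging the dominant contributor. All figures below are invented:

```python
# Hypothetical per-stage latency sample for one I/O path, in milliseconds.
# A real AIOps tool would collect these continuously from telemetry.
path_latency_ms = {
    "virtual machine": 0.4,
    "server": 0.3,
    "network": 0.2,
    "storage": 2.6,   # the outlier in this made-up sample
}

total = sum(path_latency_ms.values())
bottleneck, worst = max(path_latency_ms.items(), key=lambda kv: kv[1])

print(f"{bottleneck} contributes {worst / total:.0%} of {total:.1f} ms total")
```

Even this naive breakdown shows why cross-layer visibility matters: a team looking only at server metrics would never see that, in this sample, most of the latency sits in the storage stage.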


Hitachi helps business and IT leaders respond to modern data challenges by leveraging predictive analytics and automation as part of a larger data management strategy, ultimately enabling innovative use of that data.


By embracing predictive analytics and IT automation software, companies can deliver the agile data infrastructure and intelligent operations needed to fuel their digital transformation initiatives.



For more details on BOE and Infosys case studies mentioned in this article please check out these links:


There is a lot of mysticism surrounding the Internet of Things (IoT) as it stands today. However, when you strip this away it boils down to the ability to connect things more cost effectively and interact with those things to produce new capabilities or levels of understanding.


To understand the real value of this connectivity, we first need to understand how this technology developed.


Stage I: The Dawn of the Internet

The world went through a similar transformation when we moved from isolated computer systems to computers that were connected together via what we called the ‘Internet’. 


The connection in and of itself was not really the point, it was what it could provide us with that was important. No one ever really got excited about installing a network card and a router, but they certainly wanted access to the websites that this technology enabled.


In the early days, the value of access to websites was self-evident. The Internet enabled us to access products and the knowledge of mankind from around the world, moving the home computer from a hobby to something essential in modern society.


Stage II: Monetizing Connectivity

With the explosion of access and the never-ending stream of content came the desire among companies to profit from it, which resulted in a number of early monetization models.


One was to attempt to charge users for the content itself, but people refused to pay for content when similar alternatives could easily be found for free.


Another idea, largely borrowed from broadcast television, was to provide the content for free, and instead insert advertisements into what people were viewing. This seemed simple enough but it was a waste of a technology that by definition allowed two-way communications with its users.


Stage III: The Internet of Us

Then the era of the Social Network dawned, providing users with a free platform that encouraged them to create and post content themselves.


Some of these Social Networks struck gold. Armed with burgeoning analytics technology, they were able to extract relevant insights from vast quantities of user-provided unstructured data and enable advertisers to be far more targeted with their content, thereby making their efforts more effective and valuable.


In fact, both Social Networks and IoT are based on some similar concepts if we consider their fundamentals:

  • There is a ‘thing’ you want to interact with
  • Technology enables you to connect to that ‘thing’ and gather data from it
  • These ‘things’ are connected to some form of network which carries data to a core collection point
  • The data is analyzed and combined with readings from many other ‘things’
  • Implications are drawn from the analysis of all this data in concert and something valuable is derived
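The five steps above can be sketched in a few lines of Python. The device names and readings here are entirely made up; the point is only the shape of the flow: collect, combine, analyze, derive something valuable.

```python
# A minimal sketch of the five IoT fundamentals, with simulated sensor
# 'things'. All names and readings are hypothetical, for illustration only.
from statistics import mean

# Steps 1-3: 'things' report readings over some network to a collection point
readings = {
    "pump-1": [71.2, 73.8, 74.1],
    "pump-2": [70.9, 71.3, 70.7],
    "pump-3": [88.4, 91.0, 93.6],   # this one is running hot
}

# Step 4: combine and analyze readings from many 'things' in concert
fleet_avg = mean(r for values in readings.values() for r in values)

# Step 5: derive something valuable, e.g. flag devices drifting from the fleet
suspects = [dev for dev, values in readings.items()
            if mean(values) > fleet_avg * 1.1]

print(suspects)  # pump-3 stands out from the rest of the fleet
```

Note that no single reading is interesting on its own; the value appears only when many ‘things’ are compared against each other, which is exactly the Social Network lesson restated in industrial terms.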


In essence, Social Networks built the first platforms of the IoT era, and we were the first ‘things’.


The True Value of Connectivity

When it came to the Internet, it was not connectivity that provided the payoff, but analytics. The ability to derive insight from a massive number of data points is what really started to turn connectivity into money.


This all holds true for Industrial IoT, where we believe that by connecting many physical-world devices we can make revolutionary improvements to processes. Connecting the devices is what we need to do, but the ability to rapidly analyze the data is what makes the endeavor truly useful.


An understanding of the nature of the machines and operational processes is necessary. That understanding can then guide the analytic process, informing data mining with the suitable analytics tools and uncovering the maximum value in the data.


It pains me to say this as a network engineer, but in industrial IoT, the ‘Internet’ part is the most readily commoditized. And whilst it is fundamental and essential, it does not deliver the most value. Instead, true value lies within the ‘Analytics of Things’.

On September 19th, 2017, at the Hitachi NEXT event, Hitachi Vantara was unveiled to the world. It was the unification of three companies: HDS, Hitachi Insight Group and Pentaho. But it was a lot more than a merger and a name change; it was a signal that we are serious about changing our future and embracing a bold new direction.


Up until this point I had been part of the HDS story for about 18 months. When I was originally speaking to HDS about joining the company, the vision presented to me was one of a future where insights around data were wedded to a century of industrial operational experience to produce a kind of IoT utopia. This was just as well: I had no intention of joining a storage company with no concept of how it would evolve to its next phase, given that the world I had just come from, networking hardware, was struggling to articulate just such an evolution.


The vision articulated to me 18 months or so ago was compelling, and so I signed up. It did, however, quickly become clear to me that as HDS we carried an association with enterprise-grade storage – well deserved and envied, but in many respects hampering our ability to convince our customers that we had much more to offer. The creation of the Vantara brand has already begun to pay dividends in this regard, with many of our customers, analysts and the industry at large sitting up to take notice.


There have been a number of interpretations of the announcements of September 19th by our toughest competitors, some willfully muddying the waters, some grasping what we were doing and hoping to emulate. Without mentioning any names, some said, “Hitachi is getting out of storage, stick with us pure play vendors”, others said, “Hey we are investing in IoT, we won’t call it IoT exactly but you get the idea”.


Both of these responses are problematic. The first is a shallow attempt to mislead our customers with a thinly veiled misdirection suggesting that “IoT doesn’t need robust storage”. Well, actually it does. One of the key reasons Vantara is embracing IoT is precisely that millions of devices running round the clock, transmitting sensor readings, generate a massive, probably unprecedented, amount of data. The collection and analysis of this data to yield operational insights is at the heart of what we are doing here.


In fact, in the age of hardware commoditization, and the apparent willingness to eschew enterprise-grade storage engineering, there is still a home for exactly this quality: mission-critical systems. Mission critical is the sweet spot of Hitachi’s reason for being in the world of industrial IoT. So, no, we are not abandoning the ideals of building bulletproof, enterprise-grade data infrastructure; in fact, we are ensuring a future that needs it more than ever.


The second response, the “Hey, we are doing IoT as well” line, falls even flatter. The Hitachi proposition of better insights through data analytics probably can be emulated, but the idea that an IT rival can simply go and acquire 107 years of industrial expertise is not remotely realistic. The Hitachi Vantara value of being part of a much larger business with over a century of actual industrial success is something that no amount of innovation workshops or hackathons, or even a compelling IT-side IoT platform, can get you.


When Hitachi talks about the benefits to factory optimization that IoT can bring, we can say this because we have done this in our factories.  When we say that IoT can mean a revolution to transport systems, we can say that because we have done this with the rail systems we have built.  When we say that IoT can be a catalyst to change in healthcare, it’s because we have done exactly this in the medical machinery that we have … well you get the idea.


Hitachi Vantara is going to continue to innovate in storage and data infrastructure in general as this is exactly what the IoT-driven future demands.  We are going to be at the forefront of that IoT future as we have a unique blend of IT expertise and over a century of operational technology under our belts.


Hitachi Vantara, innovating in data by making the need for it more important than ever.

There’s little doubt in my mind that there have been two drivers in the current mania around what vendor marketers are calling ‘Digital Transformation’ or the rather less descriptive and even less appealing ‘DX’.


Having spoken to many organisations over the years, even before it was known as ‘Digital Transformation’, I have come to the conclusion that the motivators for such dramatic change are either fear or fascination.


The fear comes from the feeling that change is necessary because if you don’t change then some new upstart with an App and an Amazon account will destroy your industry.  The fascination comes from the feeling that there is something amazing to learn from this bright new visionary who is set on improving the world – with an App and an Amazon account.


From this starting point, the approach to digital transformation is formed and the seeds of success, or otherwise, are sown. HDS recently sponsored research with Forbes on the subject, which found that one of the key factors in the success or otherwise of Digital Transformation projects is whether the project is embraced by the organisation as a whole or relegated to the IT team to do. Suffice it to say, the latter approach almost always resulted in failure.


My feeling is that this could probably be traced back to the original driver of fear or fascination that gave rise to the dreaded or blessed Digital Transformation project.  In all likelihood if fear was the driver then someone handed the hot potato to the IT team as quickly as they possibly could. If fascination was the driver then chances are the lines of business were enamoured with the idea and truly wanted to make it their own, turning it into a project championed by all.


It seems to me that companies have seldom managed to transform as a result of fear. Some people point to the demise of great logos of the past; I won’t list them here, but you probably have a mental list of them. Many would suggest that they were so complacent in their success that they didn’t realise the barbarians were at the gate. Was it that they did not have enough fear?


I’m not so sure.


Apple is often held up as the example of a company that has transformed itself a number of times, emerging stronger each time. It does not seem to fear anything, and if you look at the great ‘Stevenotes’ of past Macworld events you can see a fascination with a future of possibilities and a willingness to be its own disruptor.


In the foundations of a Digital Transformation project one will generally find fear or fascination and I am sure one is much more likely to lead to success than the other.  Embrace change for the future, it’s fascinating.

According to a maxim commonly attributed to Charles Darwin, it’s not the strongest of the species that survives, nor even the most intelligent. It is the one that is the most adaptable to change.


What’s interesting about this Darwinian view is that it implies “innovation” in a species relies on accidental mutations that prove beneficial over millions of years in an ever-changing environment.


We don’t have time for mechanisms like that in the world of business today, but we do need that fundamental ability to innovate in an ever-changing environment. As the rate of change increases, so too does the need for more rapid innovation in order to survive. Accidental mutations are not going to work as the innovation engine in such a compressed timeframe, so our ability to deliberately innovate needs to be assisted by extremely agile and adaptable underpinnings.


The Dawn of Digital Transformation


The ability to drive this rate of innovation calls for “Digital Transformation”: a fundamental and accelerating change in a company’s activities, processes, competencies and models to fully leverage the changes and opportunities of digital technologies, and their impact across society, in a strategic and prioritized way that accommodates current concerns as well as future challenges.


Digital Transformation, at its heart, is about making optimal use of data: providing access to it, and the ability to build platforms from it, to the people in the organization most likely to create innovation. In effect, it hands the keys to the innovation engine to those most likely to produce great business “mutations”.


Every digital transformation begins with how an enterprise deals with data. Whenever I talk to customers about their own transformation journeys, I keep hearing a core set of questions. How do I govern my data to ensure I’m meeting all my compliance and regulatory requirements? How do I mobilize my data for different apps, different users and different purposes? How can I truly turn data into something tangible, into assets? How can I analyze all this data to turn it into insights that drive my business needs? More fundamentally, though, there is an often-implied question: “How do I get data into the hands of the people most likely to use it to come up with better or new business ideas?”


So, let’s look at answering some of these questions, and set out some stepping stones to help companies carry out their own digital transformation smoothly and efficiently.


It starts with infrastructure

At HDS, we talk a lot about the importance of modernizing data infrastructure to fully enable Digital Transformation, but what does this mean in practical terms for an organization?


We’ve identified five areas to focus on, which we feel are key to enabling Digital Transformation.


1.     Accelerate data access: Forward-thinking IT organizations understand the importance of being able to respond quickly to an ever-changing market, but delivering this requires a fundamental shift in data-access performance. It has become conventional wisdom that “speeds and feeds don’t matter”. Well, that is true unless we are talking about differentials of several orders of magnitude, which is what we now face in the transition from conventional disk to flash storage. This is not an incremental speed difference; it is such a leap in performance that things not previously possible are now possible. Flash provides the speed and agility needed to help IT infrastructure retain its competitive advantage and increase customer satisfaction. The shift to flash is critical for digital transformation, and solutions like the Hitachi Virtual Storage Platform (VSP) can deliver data faster and improve customer experiences while keeping operations running at maximum efficiency.


2.     Simplify Data Center infrastructure: The next step is to simplify, moving away from siloed infrastructure to an architecture that can deliver cloud-like agility with improved efficiency and increased manageability. Hitachi’s platform of hybrid and all-flash converged and hyper-converged solutions, such as the Unified Compute Platform, enables businesses to drive change, modernize infrastructure and stay ahead of the competition. Most importantly, these Data Center modernizations provide the building blocks for the real power behind a Digital Transformation agenda: freeing up people to innovate.


3.     Secure data and operations: Every organization undertaking Digital Transformation has come to the realization that data is one of its most important assets. With this comes an acceptance that we can no longer trade off availability against protection; we need both. This demands a modern approach to data protection: businesses need to eliminate high costs and complexity to deliver flexible, scalable solutions that protect all necessary data whilst retaining access at all times. Hitachi can help with this in a number of ways. With Hitachi Global-Active Device, for example, businesses can ensure continuous operations for key applications thanks to nonstop, uninterrupted availability across multiple Data Centers.


4.     Liberate IT staff from recurring tasks: IT staff often struggle to keep up with the technology and management complexities of constant data growth and infrastructure change. Storage tasks, provisioning in particular, are often time-consuming, require specialized skills and can be prone to human error. Many organizations today want help automating their environments but do not know where to start; we can help with that. For example, Hitachi Automation Director software lets organizations automate common, repetitive storage administration tasks. With its predefined templates and integrated service-builder tool, it can improve productivity and control rising storage costs, freeing precious IT resources to focus on delivering Digital Transformation.


5.     Leverage cloud to innovate: The digital economy demands new applications and services to continuously feed the “fail fast” mechanism of digital evolution. The ability to try an idea, ramp it quickly if it succeeds and remove it easily if it fails can only be delivered by cloud. By taking advantage of an integrated, application-centric, private cloud platform, enterprises can truly hand the keys to the innovation engine to the people with the best business ideas. Hitachi Enterprise Cloud does just this, accelerating cloud deployment and lowering the cost of management to deliver best-in-class cloud capabilities.
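As an aside on point 4, the value of template-driven automation can be pictured with a short sketch. This is emphatically not the Hitachi Automation Director API; the template name and fields are hypothetical, meant only to show why predefined templates turn a repetitive, error-prone task into a one-line request:

```python
# Hypothetical template-driven provisioning sketch (invented names/fields).
# A predefined template captures the specialist decisions once; each
# request then overrides only what actually varies.

TEMPLATES = {
    "general-purpose-volume": {"raid": "RAID-6", "tier": "flash", "size_gb": 512},
}

def provision_from_template(name, **overrides):
    """Fill a predefined template, overriding only what the request changes."""
    spec = dict(TEMPLATES[name])   # copy, so the template itself is untouched
    spec.update(overrides)
    return spec  # in a real system this spec would be submitted for execution

vol = provision_from_template("general-purpose-volume", size_gb=1024)
print(vol)
```

The design point is that the human error surface shrinks to the overrides: everything the template predefines is applied identically every time.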


According to IDC, Digital Transformation is going to change the way enterprises operate and reshape the global economy. As Digital Transformation efforts shift from 'project' or 'initiative' status to a strategic business imperative, we are reaching an inflection point that marks the “Dawn of the Digital Transformation Economy”.


The question that enterprises need to ask is whether they are willing to evolve the way they deal with digital challenges. Because if they aren’t, or if they wait too long, they risk going the way of the dinosaurs.

This is the time of year when, traditionally, those of us with titles like “CTO” are asked to take some time to think about the trends and priorities we expect to have the most impact over the next 12 months. Looking back at 2016, the IT sector was dominated by the realization that it needs to make better use of its data and digital assets and become leaner and more agile than ever before.


With digital transformation appearing on the agenda of CIOs across the world, we can expect to see an increasing number of organizations striving to reach digital maturity in 2017. This is backed up by recent research from Forbes Insights, which revealed that 42% of organizations in Asia Pacific consider themselves either advanced or leaders in terms of their digital transformation journey.


Aside from the implied continued rise of digital maturity, there are some other key trends that I expect will gain some ground in 2017:


#1: Productivity Gains Will Be More About People, Process and Business Outcomes

Despite the explosion of new technology over the past 10 years, productivity has declined compared to the previous decade according to the Organization for Economic Co-operation and Development. This is possibly because processes have not kept up with new technologies.


As a result, agile infrastructure, migration to cloud, and the benefits of DevOps will gain greater attention as ways to speed up the development and deployment of applications and services – a winning combination of fewer errors and less effort. The ultimate aim is to reduce IT “busy work” and reassign intellect towards innovation.


Agile infrastructure and cloud delivery models will empower those with the most business insight and most innovative ideas to drive their own technology projects. 2017 will be about putting the power of innovation into the hands of those with the best ideas.


#2: Accelerating Transition to Cloud

When speaking of a transition to “cloud”, I like to differentiate between moving to “The Cloud”, which most people intuitively equate with public cloud, and moving to “cloud”, by which people often mean modern, agile, self-service infrastructure, even if it is largely based on premises. I think we will see the continued rise of both types in 2017, and discussions around both will continue to be relevant.


The Asian market has been relatively quick to embrace the Cloud, with Asia leading the world in this year’s Cloud Readiness Index. My feeling is that IT managers across APAC will be focused on developing skills in Cloud monitoring, Cloud workload performance and security management, as well as Cloud capacity management. This will support their moves toward big “C” Cloud.


For little “c” cloud I feel customers will want more simplicity and automation for those pieces that remain on premise.  Instead of buying infrastructure from different vendors and knitting them together with management software, IT will want access to the converged systems that can most easily deliver infrastructure-as-a-service without being a science experiment. By combining converged solutions, like Hitachi’s Unified Compute Platform, with cloud management solutions, like VMware vRealize, enterprises can deliver a pre-engineered approach for private, and hybrid cloud via a single management interface.


#3: Bimodal IT

Bimodal IT refers to two modes of IT: mode 1, traditional, emphasizing safety, accuracy and availability; and mode 2, nonlinear, emphasizing agility and speed. While many may wish they could simply do away with legacy application stacks and start afresh, the need for business continuity built on well-understood and well-supported mission-critical systems continues. I expect bimodal IT to persist for some time yet, possibly in the same way that we will see both big “C” and little “c” cloud for some time yet, and for similar reasons.


From a storage perspective it is important that data from both IT modes can be leveraged, so organizations will look more to systems that can bridge the gap between the two. This means the ability to present cloud protocols, to be instantiated on premises or in public clouds, and to facilitate data mobility between these environments. Tools like Pentaho Enterprise Data Integration, which can bring together the data warehouses of mode 1 with the unstructured data of mode 2 to give users a clear view of all their data, will gain significant traction.



#4: A Centralized Data Hub

Data is becoming increasingly valuable. Recent IDC research revealed that 53% of organizations in the region consider big data and analytics important and have adopted them or plan to adopt them in the near future. Companies are finding new ways to correlate and merge data from different sources to gain more insight, while repurposing old data for new uses.


Highly disruptive Internet-based businesses have shown that the ability to wield data effectively is extremely valuable. To ensure the governance and accessibility of this data, IT needs to create a centralized data hub for better management, use and protection of data. This “repository of everything an organization knows” will need to be an object store that can scale beyond the limitations of traditional storage systems, ingest data from different sources, and provide search across public and private clouds as well as mobile devices.
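The shape of such a hub can be sketched in a few lines. This is a deliberately naive model, not a real object store (which would add durability, scale-out, versioning and access control); the class, method names and metadata fields are all hypothetical:

```python
# A minimal, hypothetical model of a "repository of everything" hub:
# objects ingested from different sources, tagged with metadata, and
# searchable across all of them. Illustrative only.

class DataHub:
    def __init__(self):
        self._objects = {}

    def ingest(self, key, data, **metadata):
        """Store an object from any source along with its metadata."""
        self._objects[key] = {"data": data, "meta": metadata}

    def search(self, **criteria):
        """Return keys whose metadata matches every given criterion."""
        return [k for k, obj in self._objects.items()
                if all(obj["meta"].get(f) == v for f, v in criteria.items())]

hub = DataHub()
hub.ingest("report-q1", b"...", source="erp", cloud="private")
hub.ingest("clickstream", b"...", source="web", cloud="public")
print(hub.search(cloud="private"))  # ['report-q1']
```

The point of the sketch is that metadata, not location, becomes the organizing principle: once everything lands in one hub with consistent tags, governance and cross-cloud search fall out of the same index.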



#5: Growing Awareness of IoT in the Data Center

The decisions we make in IT in 2017 should be made with an eye towards IoT. The integration of IT and OT with analytics is the first step. Today, IoT requires data scientists and researchers with deep domain expertise, and most projects are in the proof-of-concept stage. Hitachi Data Systems, along with other divisions of Hitachi and outside partners, is developing an IoT core platform, Lumada, to develop and deliver IoT solutions that are open, adaptable, verified and secure.


Next year’s trends are being driven by a clear enterprise demand to deliver on all the promises of digital transformation, with APAC set to lead the way. Regardless of industry, IT is seeing a fundamental shift as enterprises embrace the new revenue streams, efficiencies and possibilities provided by digitization.


To hear more of my thoughts on APAC tech trends for 2017, please join my upcoming Webcast here on December 14, 2016.


Technology and banking have always shared a critical bond, from the minting of the currencies of ancient empires, to the introduction of the world’s first ATM in 1967, through to today’s highly digitized banking sector. It is unsurprising, therefore, that those in the industry are under constant pressure to bring new, compelling products to market, providing customers with easier and more efficient experiences.

The Asia-Pacific region, which includes several key financial hubs, has seen skyrocketing investment in banking solutions in recent years. According to Accenture, investment in FinTech in Asia-Pacific increased from US$880 million in 2014, to nearly US$3.5 billion in the first nine months of 2015.

So we recently got together with IDC Financial Insights to take an in-depth look at the state of IT in banks across the Asia-Pacific region and examine how the latest technology trends in Digital Transformation are creating new opportunities, and some challenges, for those in the sector.



Feel free to register and download IDC’s Financial Insights infographic here.


According to IDC Financial Insights, IT spending has increased by an average of 6% annually over the last four years. However, 80% of banks see high total cost of ownership (TCO) in infrastructure-related investments. This has driven many to adopt solutions designed to improve cost efficiency, a key factor in the rising popularity of all-flash storage in the region. Many anticipate massive change in the sector, predicting that 15% of the traditional banking business will be gone by 2018, outpaced by more agile companies.


Those in the banking industry perceive the sector as being weak in many aspects of innovation, calling out 360-degree insights into customers, operations and risks as key focus areas. 90% still cite regulatory compliance as a ‘major concern’ despite the increase in their budgets. The IT market has been quick to respond to this, creating solutions to help those in regulated industries gain visibility and improve governance of their data, such as Hitachi Content Platform. When it comes to security, 25% of banks have faced a Severity 1 incident several times in the past year, and with downtime costing banks in the region an average of US$1.4 million per hour, these are breaches that banks can ill afford.


The changing priorities and areas of focus brought about by Digital Transformation are creating new opportunities for companies and vendors to stand out. Asia-Pacific has plenty of examples of early leaders that are shifting their approaches and strategies so they are primed for greater transformation.


Tier 1 banks in Hong Kong and Singapore are streamlining data processing so they can quickly detect any anomalies that could pose a threat to security. One of Australia’s ‘Big Four’ banks has implemented a turnkey system, allowing it to reduce the costs associated with meeting Dodd-Frank compliance. Banks in India are integrating storage solutions to reduce capex and embracing Pentaho-embedded analytics to enable the adoption of new mobile banking services through greater business insight. Meanwhile in China, tier 1 banks have consolidated their infrastructure, with one able to realize a 45% reduction in IT costs. Change is taking place across the region and the good news is that, according to IDC Financial Insights, 50% of the infrastructure spending of Asia-Pacific banks can be optimized by 2018.


The question for many will be ‘where to start?’ Here are a few steps to getting it right:


  • Optimize TCO and improve core application performance – these are important first steps
  • Take a smart, real-time approach to compliance – use IT assets built for compliance to ensure growth, create a centralized, comprehensive data set for compliance, view IT as a true data broker, and run mission-critical workloads through virtualization and converged frameworks
  • Become an always-on enterprise – ensure high availability for truly mission-critical apps, invest in always-on infrastructure and best practices, and adopt a zero RTO/RPO strategy to eliminate traditional IT failovers
  • Improve your capacity for innovation – use data integration and analytics to identify growth, retain customers, optimize operations, mitigate risk, detect fraud, and ensure compliance; adopt analytics to enhance data warehouse investments, and optimize the management and use of data from various sources, using object storage for unstructured data


Embracing this kind of transformation offers huge potential for those in the Asia-Pacific banking sector. Companies looking to start their journey can sign up to my upcoming webinar with IDC Financial Insights, where we will discuss some of these findings in more detail and answer any questions you might have about Digital Transformation.



Russell Skingsley, Chief Technology Officer, APAC, Hitachi Data Systems