
Innovation Center

Posts authored by: Adrian De Luca

It is, by all accounts, a humongous trade agreement. On 4 February 2016, 12 Pacific Rim countries signed the Trans-Pacific Partnership (TPP) agreement after seven years of negotiations. As noted in a BBC report, the countries have between them a population of about 800 million – almost double that of the European Union – and collectively they are already responsible for 40 per cent of world trade.


The agreement, which is expected to be ratified over the next two years, promises to bring significant benefits to economies in the Asia Pacific. As I mentioned in my Predictions for Asia Pacific in 2016, the TPP will usher in new and exciting opportunities for companies to expand across the Asia Pacific region and, if implemented properly, lower barriers to consuming intercontinental cloud services.


But to realize this potential, budding enterprises have to start preparing for a multi-cloud strategy today. Not only is it important to be aware of the dynamics that are shaping the cloud landscape in the region, but it is also key to understand how to build a technology stack that provides universal deployment, overcomes bandwidth limitations and eases the mobilisation of workloads.


Firstly, it's important to understand that the cloud environment in Asia Pacific has been evolving quite differently from that in the United States and Europe. Here, there are relatively more instances of private cloud than public cloud, as many Asian enterprises continue to want to own and control their IT assets. The sovereign nature of the geography also means there is unlikely to be just a handful of dominant cloud providers, so although choice will drive cost benefits, it can also introduce technology risks. These are just some of the reasons the cloud landscape in the region is likely to be more fragmented than it is anywhere else in the world.


Another important dynamic is the regulatory landscape, which varies across industries and countries in Asia. Take the financial services industry (FSI), for example. In Hong Kong, the government discourages the banking sector from operating anything in the public cloud, whereas in Australia there is no specific mandate around the use of public or private clouds. Instead, the regulatory focus is more on addressing privacy protection concerns, requiring that the service provider can demonstrate it meets the required level of governance.


A third dimension is infrastructure. Infrastructure development across the region is uneven. In India, for example, there are few cities with well-developed, high-bandwidth networks that make it feasible for data center operators to offer cloud facilities.


So how do these factors come into play as enterprises work out their cloud strategy in anticipation of the TPP?

First and foremost, there has to be a cultural change within many organizations before they can fully capitalize on intercontinental cloud services for trans-regional business. Re-examine your IT priorities and be prepared to relinquish some degree of control as you move more of your workloads out into the public cloud.


Develop a technology and cloud management stack that will not lock your applications in at any layer. Building services on open source frameworks and platforms will enable you to integrate with multiple cloud IaaS and PaaS environments, and will give you the ability to deploy workloads seamlessly from one location to another and from one CSP to another.
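To make the idea concrete, here is a minimal Python sketch of the principle (the interface and backend names are invented purely for illustration, not any vendor's API): applications code against a provider-neutral interface rather than a specific CSP's SDK, so moving a workload between providers touches only the backend, not the application.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Provider-neutral storage interface. Applications depend on this,
    never on a particular CSP's SDK, so they stay portable."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend for the sketch. A real deployment would wrap
    each CSP's client library behind the same interface."""
    def __init__(self):
        self._objects = {}
    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data
    def get(self, key: str) -> bytes:
        return self._objects[key]

def migrate(source: ObjectStore, target: ObjectStore, keys):
    """Moving data from one provider to another uses only the
    neutral interface; no application code changes."""
    for key in keys:
        target.put(key, source.get(key))
```

The design choice is the point: lock-in is avoided not by any one framework, but by keeping the provider-specific code behind an interface that every backend implements.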


Beware of “proprietary” clouds. Some CSPs claim to be open because they use open source technologies like OpenStack, but their services may not be replicable elsewhere because they have closed APIs or lack the data center ecosystems. Enterprises could find themselves locked in by the cloud vendor, making it difficult to expand into certain countries.


To manage the different regulatory approaches to the cloud across the region, enterprises have to ensure that their CSP abides by a firm code of conduct with respect to issues such as data privacy and disclosure of data breaches. In a report on FSI Regulations Impacting Cloud in Asia Pacific Markets, the Asia Cloud Computing Association noted that financial services institutions can maximise the benefits on offer from the use of cloud services by ensuring that they are familiar with the regulatory landscape, and able to assess the degree to which the CSP’s offering fits into the regulatory landscape.


Another consideration in developing a cloud strategy is the infrastructure development of the country where the CSP operates. Several IT providers like SubPartners and Megaport are already laying the groundwork by investing in cross-continental high-speed connectivity to create direct routes between key economies in South East Asia, Australia and the United States of America.


It is also a positive sign if the city or country is investing in data centers. We saw this happening in Australia about five years ago, when customers began to realize that it made little economic sense to build and operate their own data centers. First they outsourced parts of their IT with co-location; next came the rise of cloud providers, which gave rise to a massive data center buildout in 2012. We are now seeing the same thing happening in countries like Singapore, Hong Kong, Taiwan and Malaysia.


Yet another factor that enterprises should consider is the availability of skillsets to support their multi-cloud strategy. Countries like Singapore and India, for example, provide a very good skills base for the development of the cloud ecosystem.


The TPP holds immense promise for the business opportunities it will unlock across this region, if it can get the right political support. However, it will take an evolution in cultural thinking, levels of governance and investment in physical infrastructure to make trans-regional business a reality. Organizations have to start building their applications, systems and cloud strategies today in anticipation of these developments.

If you have transited through any major Asian airport lately, it's hard to miss the glossy billboards pointing to just how fast cities in this region are changing. But behind the world's best airports, glitzy new residential towers and state-of-the-art transport infrastructure lies a much bigger ambition than simply building beautiful cities: it is to build Smart Cities.


Here in Asia Pacific, the momentum has reached fever pitch, fuelled primarily by one critical imperative: the huge population explosion. According to the Boston Consulting Group, Asia is home to 16 of the top 21 cities expected to experience the highest population growth.


In India, the government of Prime Minister Narendra Modi has outlined a vision to develop 100 smart cities as new satellite towns or by modernising existing mid-sized cities. In January, it announced the names of the first 20 urban areas to be developed, with a proposed investment of Rs 50,802 crores (US$7.5B) over a five-year period.


In China, authorities say smart city technology is a key national policy to help drive the country’s rapid urbanization. Efficiency in smart city development has even been adopted as a performance benchmark for local officials in some provinces as the country positions itself for a leadership role in smart city technology. According to a report in China Daily, the country’s accumulated investment in smart cities is on track to exceed 2 trillion yuan (US$322 billion) by 2025.


Singapore is another front-runner in the smart cities race, topping IDC Asia Pacific’s Smart Cities chart in 2015. Last year, the country announced about S$2.2 billion worth of government ICT tenders, with key areas of procurement focused on realizing the country’s Smart Nation vision, such as the development of the Smart Nation Platform that will enable public agencies to enhance their situational awareness and to deliver anticipatory responses and services to businesses and individuals.


According to research estimates from Frost & Sullivan, smart cities present a combined market potential of US$1.5 trillion globally by 2020, across segments such as energy, transportation, healthcare, building, infrastructure, and governance. This creates vast market opportunities in this part of the world for companies prepared to invest.

Across all these initiatives, the key to unlocking the potential of a smart city lies in the merging of the physical and digital worlds, the combination of operational technology (OT) and information technology (IT) to create an Internet of Everything that will generate vast amounts of data at a faster pace than anything we have seen in the history of digitization to date. All this data – machine data, business data and human data – will have to be analyzed at incredible speed to deliver the insights and enable the responses and services required for a smart city.


So what does this mean from an IT infrastructure perspective?  And more importantly are we ready?


The standard hyper-converged infrastructure that we have today is able to ingest and manage data through virtual machines (VMs) and is great for supporting Infrastructure-as-a-Service and applications on demand. However, with the data in each VM containerized, it is not shareable between different uses. We also have big data analytics and Hadoop, which can manage and process data in parallel, but only once it has been copied into the Hadoop Distributed File System.


What is needed for smart city applications is a new scale-out architecture that combines both to avoid the creation of data silos, while avoiding time-consuming and expensive ETL (Extract, Transform, Load) operations by maintaining data in place. Such a platform will be able to ingest data at speed, spin up VMs on demand to execute applications that share data between all VMs, and execute Hadoop jobs directly at the data source alongside them.
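A toy Python sketch of the "data in place" idea (the record layout and both consumer functions are invented for illustration, not any product's API): an application path and an analytics path both work directly against the same source data, with nothing extracted, transformed or loaded into a second silo.

```python
# One shared data source; in a real platform this would be the
# ingested sensor stream, not an in-memory list.
records = [
    {"sensor": "traffic", "value": 120},
    {"sensor": "energy",  "value": 75},
    {"sensor": "traffic", "value": 90},
]

def serve_application(source):
    """'Application' path: the latest reading per sensor,
    read straight from the shared source."""
    latest = {}
    for r in source:
        latest[r["sensor"]] = r["value"]
    return latest

def run_analytics(source):
    """'Analytics' path: aggregate directly at the data source,
    instead of first copying everything into a separate system."""
    totals = {}
    for r in source:
        totals[r["sensor"]] = totals.get(r["sensor"], 0) + r["value"]
    return totals
```

Both consumers iterate the same records; the ETL step, and the silo it would create, simply never happens.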


This is precisely why we recently announced the Hitachi Scale-Out Platform. Not only does it have all the attributes above, but together with our Pentaho integration, blending data sources and visualizing them doesn't require separate siloed infrastructure and can be easily spun up. Furthermore, it connects easily into existing enterprise applications like SAP S/4HANA through connectors.



Technology companies will play a key role in delivering this new architecture and meeting the other requirements of a smart city. Governments have neither all the experience nor the funding to embark on these initiatives on their own, so partnering will be a necessity for success. Companies like Hitachi, which have been involved at multiple levels of the smart city supply chain, will be the catalysts for making smart cities a reality.


Hitachi has invested some $1 billion in IoT and big data R&D to date, and is a leader in data analytics and technology patents. It is bringing its entire arsenal to bear to deliver smart city solutions that address urban challenges such as healthcare, transportation, energy, logistics and many more. With end-to-end expertise and partnerships covering everything from the underlying technology products to the delivery of services, smart cities will be built by these smart companies.

What began 10 years ago in the retail and ecommerce space, sparked off by the likes of eBay, is now here in every sector of the economy. Digital disruption is not only coming, it's already here! We know this when new start-up companies grow customer acquisition and value faster than traditional ones in a matter of just a few short years. So how are they doing it, when the world's largest taxi company (Uber) owns no taxis, the largest accommodation provider (AirBnB) owns no real estate and the world's most valuable retailer (Alibaba) owns no inventory? The answer is that they are focusing on transforming the customer experience, not building their brands.



In my top trends for Asia Pacific in 2016, I headlined my predictions with "traditional enterprises will transform into digital natives". So the question is: how should traditional enterprises respond?


In the face of constant exhortations to “digitise or die”, it is important for traditional businesses to understand that digital transformation presents an opportunity for them to pull ahead, rather than a threat of extinction.  And in addition to providing customers the experience they expect, they can improve their productivity and profitability.


Harnessing digital transformation for productivity involves creating a workplace culture where the digital generation – the Gen Ys and emerging Gen Zs – can thrive. This means empowering them with new digital tools for collaboration and workplace productivity, and being open to new practices such as the crowdsourcing of ideas to drive innovation and value creation. This will help businesses to attract talent and address the skills shortage and productivity issues that are hampering growth.


Digital transformation also presents enterprises with new avenues to pursue profitability. Here in Asia Pacific an excellent example is the Online-to-Offline (O2O) opportunity, which is about creating an online presence and using it to drive customers to offline channels of engagement such as physical stores and events. Consumers today, for instance, are more likely to buy a product based on friends’ recommendations on social media than a billboard that is placed along a highway.


The quest to bridge the gap between online and offline commerce is set to be a dominant force in Asia’s retail landscape in 2016. According to a Forbes report, the Chinese O2O market is more than $150 billion in size with just 4% Internet penetration, pointing to a huge potential for growth.


Mobile wallets play an important role in the O2O value chain. Besides being a convenient mode of payment, they are also able to store points, cards, coupons and other loyalty incentives to drive O2O traffic. Traditional financial institutions are already making their move in this space. HDFC Bank, a leading card issuer in India, recently announced the launch of a digital wallet and an electronic marketplace for online merchants.


IDC is predicting a “mega transition” in digital transformation in 2016. Enterprises will “flip the switch”, committing to a massive new scale of digital transformation and 3rd Platform technologies (mobile, social, cloud and big data) as they jostle for leadership positions in the hyper-digital economy. By the end of 2017, 60% of the top 1,000 enterprises in the Asia Pacific will have digital transformation at the centre of their corporate strategy, said the research firm.


Like private sector organisations, government too is not immune to digital disruption. Singapore recently announced the restructuring of two statutory boards to form the Infocomm Media Development Authority as part of a concerted effort to seize the opportunities of ICT-media convergence and ride the digital wave. In Australia, the federal government has signalled that it is serious about digital, with the establishment of its national Digital Transformation Office.


The pace of digital disruption is accelerating, and we have seen how it is impacting every function across the organisation, from HR to marketing to finance. Digital transformation cannot, therefore, be the sole responsibility of the CIO or the IT function; it has to be a team sport. And like all team sports, when the objectives are clear, morale is high and every player pulls their weight, the chances of scoring will be high. Remember, digital disruption is not a death knell but a wake-up call.

Have you ever wondered how many times a day you check your smartphone? Well, one recent study suggested it could be as many as 84 times a day!


Surprised? Well, you shouldn't be, because here in Asia Pacific smartphones have become intertwined not only with our everyday lives, but with the fabric of the culture of the 2.5 billion people who have them. In fact, in Japan they have spawned an entire subculture with its own name: keitai culture.


Using apps for instant messaging, social networking and search is so ingrained into our everyday routine that we may not even notice their prevalence anymore. But how does the ubiquity and rapid adoption of this technology affect business?


Technology now drives business, not the other way around, and to help answer this question, here are my top 5 predictions for Business and Technology in Asia Pacific for 2016:


#1: Traditional enterprises will transform into digital natives

Digital transformation, commonly referred to as DX, is fast becoming THE number one organisational issue. With almost no industry immune from disruption by fast-moving, agile start-ups, traditional industry players have realised that this is about more than just survival; it's about sustaining themselves in the next era. But it is no longer just CIOs who are championing the need for digital change; leaders across all business functions are too. For example, CMOs are finding that traditional ways of marketing are not as effective any more, while CFOs are discovering that customers are demanding different transaction models with more convenient payment options. There is now an almost universal understanding within businesses that all functions need to work together to look at how they can transform their own practices through digitization.


A recent study conducted here in Asia Pacific highlighted that while 37% of companies are already undergoing digital transformation, almost 60% plan to embark on the transformation journey over the next 12 to 24 months.


#2: Smart cities will be built by smart companies

Smart Cities have been a topic of interest for a long time in Asia Pacific, with many countries in the region rolling out their own initiatives to tackle everything from public safety to improved transportation. However, few governments have the capabilities, or the capacity, to start up these initiatives on their own and are instead relying on innovative integrators to create and develop the solutions needed to make these cities a reality.


Smart companies will be the catalysts for making Smart Cities a reality, as governments open the door through initiatives like Digital India, Smart Nation Singapore, and Digital China. The business opportunities for companies in the sector are huge, with the annual smart city investment in technology alone set to quadruple to $11.3 billion by 2023.


#3: Business silos will be unified by cross modal IT

Until recently, there have generally been two modes of applications available to business.


Mode 1 applications handle traditional systems of record – such as CRM and e-commerce systems. These systems are built around predictability, accuracy and availability, given the sensitive data they hold.


Mode 2 applications are systems of insight, which are more exploratory. They give a picture of what is going on inside a business – enabling users to layer data sets over each other to see if they match a certain hypothesis. If they don't match, the organisation can throw them away and quickly try out new ones.


With the need to optimise Mode 1 IT with faster deployment and greater automation, as well as Mode 2 applications being embedded into the customer experience, the systems and skills required to run both modes need to come together. The result combines the stability of traditional systems with agility and speed, and this is something HDS expects to continue to grow next year.


#4: Trans-regional business will be enabled by multi-cloud

With the foundation of the Trans-Pacific Partnership now set, the Asia Pacific technology market will accelerate efforts to enable businesses across the region to take advantage of this enormous economic opportunity.


This opening up of the market will create new opportunities for businesses to consume cloud services and expand the options they have when it comes to building hybrid clouds. With a more seamless trade and investment environment, companies will be able to expand their market opportunities with fewer barriers. From an IT perspective, enterprises will look for more flexibility over the cloud services they use and will want to locate, and potentially migrate, workloads to different regional service providers.


Several companies are already building new data center capacity in places like Singapore, Hong Kong and India, while others are investing in improving cross-border connectivity, creating direct routes between areas like South East Asia, Australia and the United States of America so all economies can trade through high-bandwidth, low-latency connectivity.


#5: Skills shortage will spark a talent pursuit

As companies strive to transform into digital-first businesses, the skills shortage will magnify, particularly in the areas of social media, cyber security and analytics.


Governments are trying to enhance access to the skilled IT labour market by introducing new tax incentives and passing laws to allow for easier investment in start-ups, such as through crowdfunding. Evolving the skills and productivity of existing employees is another focus for governments, with Singapore investing $1.2 billion in technology development to drive improvements within its public sector.


Addressing the IT skills shortage will not just be about pumping out IT graduates, but about appealing to the interests of the best young talent while increasing the productivity of existing employees. The working practices of emerging 'Gen Z' workers are vastly different from those before them. With this generation expected to work an average of 17 jobs in their lifetime, they will pick up a greater variety of skills during their careers, but they also expect to be stimulated by what they do.


Over the coming weeks, I will crack open each of these trends in further blogs to share some insight into how some of the smart companies are leading these exciting changes.  On January 19th, I will also be hosting a live webinar where we will discuss these further.


In the meantime, I wish everyone a happy festive season and prosperous new year!

In the mid 90's I used to hang out at Bennett's Lane, a popular club tucked away in one of the quaint laneways of Melbourne, to listen to the cream of the city's jazz scene. The club, sadly, is now closed, but what used to intrigue me the most was how seemingly uncoordinated musicians playing eclectic instruments produced such wonderfully orchestrated dulcet tones. Mastering jazz is one of the toughest feats for a musician; despite its improvisation, it's actually based on principles of following common chord tones and tensions.


So what does this have to do with IT?  I’ll get to that…


Fast forward to 2015, and the noise around containers has reached fever pitch over the past six months. As companies like Docker and CoreOS step up to pitch their visions to the technology faithful, infrastructure vendors are scurrying to demonstrate that they are hanging out with the newest cool kids on the block. Along with all this hype, there has been an equal amount of confusion comparing and contrasting containers with virtualization. Depending on which side of IT you come from, it can be difficult separating the two; here is an article I found that does a good job of it.


Religious wars are commonplace in enterprise tech, and some of them reach legendary status (how can we forget the browser wars of the 90's!). But in the real world of business, the CIOs I speak to generally care about two things: how do we optimize our existing IT environment? And at the same time, how do we ensure we are innovating to meet customers' ever-increasing expectations?


The first inevitably deals with managing cost by modernizing infrastructure, automating processes whilst still balancing organizational compliance and risk.  The second is very different, it’s all about responding to rapidly changing market pressures, new competitors and escalating user demand so the aim of the game is assembling services quickly from a variety of places.


In amongst all of this, the other force CIOs are contending with is how SMAC (Social, Mobility, Analytics, Cloud) can best be leveraged. These nexus forces offer the opportunity not only to better align cost to revenue, but to rewire companies to give them the competitive edge in disruptive business conditions. To truly capitalize on this, IT needs to elevate its value within the business, which is not such an easy thing to do.


Gartner coined the term Bi-Modal IT to describe the modern organizational model of IT with two discrete sets of operating principles. Somehow, many people have inferred that the first mode deals exclusively with enterprise applications whilst the second deals with new-generation cloud native applications. Furthermore, when it comes to building infrastructure for them, many vendors have taken the approach that they should be siloed. I question this and ask why?


This great blog post from Simon Wardley draws an interesting analogy between this movement and pioneers, settlers and town planners. I find myself agreeing with him when he says “work taken from the pioneers and turned into mature products before the town planners can turn this into industrialised commodities or utility services. Without this middle component then yes you cover the two extremes (e.g. agile vs six sigma) but new things built never progress or evolve.”


With the attributes of Modes 1 & 2 understood, why must they be applied mutually exclusively? Why can't enterprise applications introduce agile development practices to augment the reliable quality assurance processes of building software? When it comes to governance, why can't we reduce the component-level iterations of continuous delivery practices by incorporating the better understanding of holistic requirements offered by the waterfall model? And given the interdependency and complexity in delivering the ideal user experience, shouldn't we harmonize the teams responsible for delivering IT by cross-skilling both sides? Surely the aim should be to build Cross Modal IT?


As a strategy, building discrete information infrastructure and service delivery capability for these two modes of IT, with diverging tools, processes and skills, won't achieve the CIO's imperative of optimization and, more than this, is just plain dumb. Not only will doing so throw the data center back to the spaghetti central of yesteryear, it is actually counterproductive to the goal of gaining agility across all facets of IT. Traditional applications that are still very critical to business functions need to innovate beyond simple re-platforming onto new infrastructure and evolve to leverage new services. Likewise, cloud native applications need to do more than spin up features quickly; they need to be reliable and trustworthy.


So what are some examples of how we can leverage the best of both worlds to create a better optimized yet innovative IT practice?


For enterprise applications, doing test/dev or bursting into off-premises, consumption-based public cloud can fast-track deployment as well as significantly reduce costs by better aligning spend (opex/capex) to demand.


This is why HDS last year introduced cloud tiering capabilities into HNAS and HCP, and integrated UCP with VMware's vCloud, giving organizations the ability to extend public cloud services to traditional enterprise applications for things like disaster recovery, backup and archiving. This not only modernizes the environment, but can enable new things like analytics to be performed more easily.


Likewise, the ability to run cloud native applications with full portability within the safe confines of a private cloud, or where turnstile cloud pricing can be quite prohibitive, is another big benefit. Here in Asia Pacific, tight regulation of industries like banking, demarcated political and economic zones and bandwidth restrictions have made it increasingly clear there is no one universal cloud across the region. Furthermore, after years of sustained price cuts from public cloud providers, last month we saw the first round of price hikes, prompting organizations to reevaluate their costs and risks. Organizations don't just want choice, they need it, which is why so many CIOs are opting for hybrid-cloud strategies.


Last month HDS announced support for Google Kubernetes on the Unified Compute Platform, giving developers not only the ability to mobilize their applications and microservices on their cloud of choice, but also a single toolset to abstract it. Coupled with tools like UCP Director, which offers the same visibility, orchestration and automation benefits for the underlying infrastructure, developers can use the same APIs to consume it as a service without changing code. More than this, they can extend many of these benefits to enterprise applications too.
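The portability Kubernetes offers comes from its declarative API: the same manifest works on any conformant cluster. As a minimal illustration (the application name and registry URL below are invented for the example), a standard apps/v1 Deployment can be built programmatically and applied unchanged on-premises or at a public CSP:

```python
import json

def deployment_manifest(name: str, image: str, replicas: int = 2) -> dict:
    """Build a standard Kubernetes Deployment manifest (apps/v1).
    Because the API is identical on any conformant cluster, this same
    document can be applied to a private or public cloud unchanged."""
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

# Serialize for `kubectl apply -f -` against whichever cluster is current.
manifest = deployment_manifest("orders-api", "registry.example.com/orders:1.0")
print(json.dumps(manifest, indent=2))
```

Moving the workload between clouds then becomes a matter of pointing the tooling at a different cluster, not rewriting the application.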


Although the integration of infrastructure and tooling doesn't address everything (fusing agile and traditional service delivery practices is arguably the toughest nut to crack), HDS' vision is to help organizations move toward Cross Modal IT. By making sure those chords and tensions in the IT foundation work harmoniously together from the outset, the equally important goals of optimization and innovation can be accomplished in concert.


Can anyone recommend a good jazz club?

(This is a continuation of a previous blog)


Have you ever been surprised by how much your bank knows about you when speaking to a customer service representative? If you have been unlucky enough to have your mobile phone lost or stolen, how long does it take before a cold sweat breaks out when you realise your entire life is sitting on there? And how many times have you Googled someone to find out more about them?


The fact is, each time we use a service, take a photo or write a comment on social media, we leave a digital footprint about who we are and, more often today, where we are. There is no doubt the digital age has made our everyday interactions more convenient, but at what cost? Over the past few years we have witnessed an acceleration of personal identity fraud, targeted hacking of high-profile organizations like Sony and Apple, as well as increased surveillance between nations, such as the scandal that engulfed Australia and Indonesia last year. With such breaches making headlines almost every week, consumers and citizens are rightfully questioning exactly how their personal information is being protected.


Technology is more than just pervasive within our society; it's deeply embedded and largely invisible. The livelihood of individuals, the reputation of companies and indeed the confidence in economies depend on adequate custodianship of data. We have reached a tipping point where digital privacy can no longer take a back seat. This serves as my final prediction for the year ahead:


As information technology drives more implications for personal privacy, businesses will increase investments to address compliance


Over the past two years, data privacy regulation has really come of age across Asia Pacific. Multi-national organisations operating in the region are the most concerned, with a recent study revealing that almost 4 out of 5 believe privacy or data protection represents the biggest risk under regional laws. To allay such concerns, governments have been responding by stepping up policy. Although mature countries like Australia, New Zealand, Hong Kong and Japan passed data protection laws in the 1990s, many of these have been updated recently to deal with increasingly digital access practices. We have also seen countries like Singapore, Malaysia, the Philippines, South Korea and Taiwan enact new privacy laws in the past three years that enforce the treatment of personal data.

The Chinese central government has passed a raft of new legislation to encourage compliance with data confidentiality rules, but broader state secrecy laws still create confusion for many businesses. For example, state secrecy laws prohibit the transfer of information deemed to be of economic value, yet there is little clarity on what qualifies as a "state secret".

Over in India, rules and standards have been added to the data privacy laws introduced back in 2011 to help increase adoption.  In particular, Sections 43-A and 72-A of the act, which primarily deal with compensation for negligence in implementing and maintaining reasonable security practices and procedures in relation to sensitive personal data or information (SPDI), strengthen the legal requirements on commercial organisations.


In the past, the priority for organisations was to protect their own commercially confidential data by safeguarding valuable intellectual property.  However, in the new era of privacy, this is not enough.  In order to secure the confidence of the people who use your products and services, as well as the reputation of your brand, you need to demonstrate you are following best practices in protecting your customers' personal information too.

So where are the biggest sources of privacy breaches and what can we do to better protect personal information?


Traditionally, the vast concentration of personally identifiable data has resided within the databases of enterprise applications like CRM, eCommerce, marketing and HR systems, or in office documents (i.e. email, documents and spreadsheets) on file servers.


However, smartphones have emerged as the biggest source of valuable personal information.  Unfortunately, they also happen to be among the least secure devices, with a recent study uncovering that 85% of mobile apps are not up to scratch.

But protecting personal data involves far more than common IT security practices, which focus on deploying solutions to secure applications in the data centre and at the perimeter.  It requires a much broader approach.  Here are four steps I recommend:

1. Assess your risk

Start by conducting an audit to assess your organisation’s state of compliance.  Study the latest laws and regulations and ask each of the company functions what personal information they collect, manage and store. From there, understanding your exposure and what policies should be applied (i.e. retention, disclosure and disposal) forms the programme of works.  Mapping that back to the business applications that manage the data will help you assess the scope, timeframes and budget.
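To make the mapping concrete, here is a minimal sketch of how an inventory of personal data records could be checked against a retention policy to flag what is due for secure disposal. The field names and retention periods are illustrative assumptions, not drawn from any specific regulation or product.

```python
from datetime import date, timedelta

# Hypothetical sketch: each record of personal data carries the date it
# was collected and the retention period (in days) your policy assigns.
def records_due_for_disposal(records, today):
    """Return ids of records whose retention period has expired."""
    due = []
    for rec in records:
        expiry = rec["collected"] + timedelta(days=rec["retention_days"])
        if today >= expiry:
            due.append(rec["id"])
    return due

records = [
    {"id": 1, "collected": date(2012, 1, 1), "retention_days": 730},
    {"id": 2, "collected": date(2014, 6, 1), "retention_days": 730},
]
print(records_due_for_disposal(records, date(2015, 1, 1)))  # [1]
```

In practice this kind of check would be automated by the data management layer rather than hand-rolled, but it shows the shape of the policy-to-data mapping the audit produces.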

2. Review your technology capabilities and identify gaps

Your information systems can help you maintain compliance.  Assess your current technology capabilities and determine how you can leverage unused features or functions to automate the process.  For gaps, research software that can augment your existing environment to improve privacy compliance.  For example, object-storage solutions like the Hitachi Content Platform can help preserve and protect the most critical data and provide audit logs of changes.  Private file sync and share solutions like HCP Anywhere provide end-to-end encryption over public networks to thwart network sniffing. Application-independent search software like Hitachi Data Discovery Suite can greatly accelerate the time and reduce the cost of e-discovery.  And data migration software like Data Migrator on Hitachi NAS can help automatically dispose of data securely when it's no longer needed.

3. Create a Privacy Policy and publish it

Privacy and compliance is as much about people as it is about processes and technologies.  Document your privacy processes and communicate them to your employees to create a culture of compliance.  Employees need to understand how important it is to the company and be educated about the mechanics of how it affects their job function. They also need to be advised about how external information collection systems (say, from third-party service providers) need to be managed.  Publishing your privacy policy to your customers is just as important, as they need to be reassured about the way you manage their personal information.

4. Appoint a Privacy Manager

Maintaining a culture of compliance is not a one-off initiative; it needs to be ingrained into the way the organization does business.  The role of the Privacy Manager is to take responsibility for the practice beyond the implementation phase.  They ensure the processes are being followed, review policies regularly to keep pace with changes, verify that systems are audited, act as an escalation point for customer disputes and resolution, and lead continual training across your organization.

Last year, I wrote a blog post on the new Privacy Act changes in Australia and talked about some of the technology capabilities within the Hitachi Data Systems portfolio that can help with step 3.

(This is a continuation of a previous blog)


Although Zemeckis fell short of predicting how a smartphone would transform day-to-day interactions in the movie Back to the Future II, he did manage to pick up on the prominence of mobile devices and the current wave of wearables.  In various scenes throughout the movie, we see interesting technology that looks vaguely similar to Apple Pay and Google Glass.



This shift to mobile consumption has done more than relegate the humble PC to the back of the line; it has fundamentally changed the way we build applications for these platforms and the architecture of the infrastructure that supports them, prompting my next prediction:


#4 - The mobility explosion will prompt supporting information infrastructure to be more data-driven.


Asia Pacific is already the world's largest mobile region, with analysis showing 1.7 billion unique subscribers in 2013, making it half of the globally connected population, with another 750 million to come over the next 5 years.  The launch of high-speed LTE and 4G services across the region is unshackling users from notoriously poor mobile internet coverage and supercharging bandwidth.


The ubiquity of mobile networks, coupled with the fact that social media platforms are becoming the “new OS”, is giving rise to new ways to engage prospective customers.  Targeting consumers with individualised offers based on an increased understanding of their preferences, relationships and location is now the name of the game.  The trend is clear: of the 597 million active users of social media across China in 2013, 38% based their buying decisions on others' comments and recommendations online.

Here in Asia, this transformation is already well under way.  We have seen the popularity of services like YAY in India, LINE in Japan, KakaoTalk in South Korea and WeChat in China introduce new ways to book taxis, buy products online and even apply for micro loans.


In this hyper-connected ecosystem, building mobile applications with PaaS (e.g. Cloud Foundry, Azure, OpenShift and Hadoop) can not only accelerate the development cycle but bake in real-time decision making and machine learning very easily.  Furthermore, deploying them on the cloud gives them unprecedented scale and agility, but there are other factors to consider.

Today's applications are being re-platformed, replaced or, in the case of agile practices, evolved faster than ever before to keep up with consumers' expectations and competitors.  As organisations seek to find new channels, extract greater business value and monetize their information, they need to think about how to manage their most precious asset - the data!  And let's face it, it's all about the data.


So how do we build application and infrastructure architectures that are data driven?


The obvious place to start is making sure the data you need is available.  Availability goes beyond uptime: with the vast majority of data to be analysed being unstructured, it's important that access is universal, permissions are protected and performance is responsive.  This is why many organisations have turned to object storage platforms.

The Hitachi Content Platform (HCP) not only provides multiple modes of access but features extensible metadata and REST APIs to enable linkages to other data sets.  To ensure isolation and access authentication, HCP features secure multi-tenancy of applications and plugs into LDAP directories and security services like OpenStack's Keystone.  Recall of data is critical, with this study suggesting almost half of users would steer clear of an app if they experienced performance problems.  HCP's support for different storage classes and tiering capabilities for hybrid cloud ensures data is sitting in the appropriate place for the access profile.  This is especially important when doing analytics on large data sets.
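To illustrate what a REST-style object write with extensible metadata looks like, here is a small sketch that builds (but does not send) an HTTP PUT request carrying custom metadata as headers. The endpoint, path scheme and header names are hypothetical for illustration, not HCP's documented API.

```python
import urllib.request

# Sketch: store an object over a generic REST/HTTP object interface,
# attaching custom metadata as prefixed headers (illustrative scheme).
def build_put_request(base_url, path, body, metadata):
    req = urllib.request.Request(
        url=f"{base_url}/rest/{path}",
        data=body,
        method="PUT",
    )
    req.add_header("Content-Type", "application/octet-stream")
    for key, value in metadata.items():
        req.add_header(f"X-Custom-Meta-{key}", value)
    return req

req = build_put_request(
    "https://tenant.example.com", "scans/report.pdf",
    b"%PDF-...", {"department": "finance", "retention": "7y"},
)
print(req.get_method(), req.full_url)
```

The point is that metadata travels with the object at write time, which is what makes later linkage, search and policy enforcement possible without touching the application's own database.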


In an all-mobile world, access to this data can come from virtually anywhere, crossing not only metropolitan boundaries but continental ones too.  Therefore, extending these attributes intelligently outside the four walls of the data center not only automates arduous tasks like data movement, but also introduces elastic scale by making data services consistent and easy to deploy.


The Hitachi Data Ingestor (HDI) works together with HCP to effectively provide an access gateway for remote and branch locations, as well as being a data on-ramp to the cloud thereby removing the need for data locality from the application.  Similarly, HCP Anywhere is another snap-on to the HCP suite offering file sync and share capabilities over wireless networks.



The strength of Hitachi's Content and Mobility portfolio has always been the tight integration between all these modes of mobility.  Whether consuming data inside a captive data center, across multiple locations or geographies, or on consumer mobile devices, the access and data management policies are the same.   You can even choose how to deploy it, either as an appliance or as software on the cloud.  For more details of the recent enhancements to the products and portfolio, read here.


Back in December, IDC ranked HCP firmly in the leaders' quadrant of their MarketScape on Object-based Storage Platforms, saying HCP addresses many scenarios "customers are looking to accomplish in the areas of hybrid cloud and workforce mobility while retaining visibility and control over their digital assets".


In closing, whether you are building a data-driven experience for your employees or customers, not all infrastructure is created equal for serving this type of workload, so choose wisely.

(This is a continuation of a previous blog)


In the movie Back to the Future II, the only clouds we see are the white fluffy ones up in the sky as the flying DeLorean descends onto Hill Valley, rather than the computing variety.  However, the old Apple Macintosh Marty sees in the window of the antique store implies the world of computing has moved on in 2015 from the humble desktop personal computer.


Cloud computing and the shift to consumption-based IT are not only changing the entire economics upon which IT infrastructure is procured and capitalized, but revolutionizing the way in which applications are being built and deployed. According to Gartner's Hype Cycle for emerging technologies, cloud is at the tail end of the trough of disillusionment and entering the slope of enlightenment.  Regardless of whether you agree with where it sits, cloud is all grown up and here to stay, prompting my next prediction:


#3 - Hybrid Cloud will emerge as the preferred way to deploy Enterprise Applications


Savvy CIOs have been steadily marching many of their enterprise and mission-critical workloads onto private clouds, especially apps like relational databases, email, ERP and VDI.  On-premise converged or integrated systems have made it easier to re-platform applications which require high availability whilst introducing some much-needed efficiencies in provisioning, automation and management.  The trend is clear: IDC estimates over 3,200 integrated infrastructure units shipped in Asia Pacific (excluding Japan) last year, a 140% increase over the year before.


At the same time, we've also seen a number of large enterprises experimenting in public clouds.  Employing the elastic compute and storage capabilities together with PaaS services makes them ideal for transient workloads and web-scale applications where it's difficult to predict scale.  Qantas, the Australian airline, recently talked about its experience of testing its engineering support applications in public clouds and seeing a 10-fold step change in the pace at which the business can deploy new services.


Although cloud delivery of IT is helping increase agility and reduce operational costs, the ease with which instances can now be spun up is also propagating “cloud sprawl”, leading the business to ask a number of pressing questions: Are workloads being placed appropriately to meet governance, security and economic requirements?  With some cloud services (especially compute and storage) dropping in price and providers consolidating, how can I avoid vendor lock-in and easily transport workloads to another provider? With cloud skills in short supply, how can I avoid having to maintain expertise on every cloud platform?


With these considerations in mind, as well as the fact that cloud platforms have reached a level of maturity, the stage is set for organizations to evolve their enterprise applications on a mix of private and public clouds.  Both cloud delivery models offer value, but bring a different set of attributes to the party.  Shifting everything to just one type doesn't make a lot of sense, but examining how your applications can optimize the use of both does.  Services which integrate both flavours to deliver a seamless Hybrid Cloud experience will not only help businesses realize greater agility and cost alignment, but also ensure they meet governance requirements.


So what does a Hybrid cloud actually look like?



Managing public and private cloud instances has been disjointed to say the least; inconsistent programmability and incompatible functionality between them have made the job tough.  However, recent enhancements to hypervisor management tools as well as Cloud Management Platforms like Citrix, ServiceMesh, Flexiant and others can help normalise the service experience.  Vendors like Microsoft with Windows Azure Pack and VMware with their reborn vRealize and vCloud Air (previously vCHS) now allow you to seamlessly provision, manage, monitor and move instances between on-premise private and their public clouds not only with a degree of simplicity, but elegance too.
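The essence of what these management layers do is put one consistent interface in front of incompatible provider APIs. As a toy illustration, here is the shape of that abstraction; the class and method names are hypothetical, not from any of the products mentioned.

```python
# Toy sketch of a cloud management abstraction: callers provision
# through one interface regardless of which cloud fulfils the request.
class Provider:
    def launch(self, size):
        raise NotImplementedError

class PrivateCloud(Provider):
    def launch(self, size):
        # A real implementation would call the on-premise platform's API.
        return f"private-vm-{size}"

class PublicCloud(Provider):
    def launch(self, size):
        # A real implementation would call the public provider's API.
        return f"public-vm-{size}"

def provision(provider, size):
    return provider.launch(size)

print(provision(PrivateCloud(), "m"), provision(PublicCloud(), "m"))
```

Normalising the interface this way is what makes moving a workload between clouds a policy decision rather than a re-engineering exercise.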


Back in October, Hitachi Data Systems announced new versions of the Unified Compute Platform for VMware’s vCenter as well as Microsoft’s System Center, enabling greater automation of both with UCP Director software.  Organisations looking for a fully managed service can now get Compute-as-a-Service to complement our Storage, Archive and Backup as a Service. Going further, we also announced a strategic worldwide partnership with Equinix, offering our managed private cloud services off-premises in their International Business Exchange™ (IBX®) data centers. Co-locating in the same facilities as the public cloud providers allows you to leverage cross-connect services like AWS Direct Connect and Azure's ExpressRoute to get performance assurance and reduce telco costs.


Hot on their heels is OpenStack, the open source alternative, now in its 10th release (Juno), with a number of large companies rolling it into their enterprises.  It's interesting to see that China not only boasts the largest user group community with over 2,000 members, second only to the United States, but that a number of local service providers like FusionCloud are already in commercial operation with the platform.  In Australia, the Government is funding a cloud that is making it easy for 55,000 researchers across disciplines to access IT resources, collaborate, and share their findings. The NeCTAR Research Cloud has over 20,000 cores running with OpenStack.

Hitachi Data Systems is not only a gold member of the OpenStack Foundation delivering drivers for the various projects, but actively contributes to the open source community: through development in the Linux Kernel Virtual Machine (KVM) helping make it enterprise grade, as a sponsor of the Open Source Development Laboratory (OSDL), and by sharing projects like the Custom Meta-Information Object Enhancement Tool (COMET).


There is still work to do in realizing full portability of application instances in the cloud.  The next battleground shaping up is lightweight containerization to help with application deployment, versioning and maintenance.  Contested by technologies like Docker, Kubernetes and now Rocket, it will be an interesting space to watch this year.


Moving application instances between clouds is one thing, but transporting data, especially large sets, seamlessly and non-disruptively across them is another.  Whether you are looking to move cold data like protection copies or active content from remote sites with limited bandwidth into the cloud, automating operations through data management policies reduces effort as well as helps maintain control and flexibility.


Last year, Hitachi Data Systems released new enhancements in the Hitachi NAS (HNAS) and Hitachi Content Platform (HCP), enabling the tiering of content to your public cloud provider of choice through support for S3 and REST.  This not only allows you to migrate between providers to avoid vendor lock-in, but to keep things like encryption and metadata control within your private cloud as part of your enterprise security regime.
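The policy behind this kind of tiering can be sketched very simply: decide an object's storage class from how recently it was accessed, and let the platform move it. The tier names and age thresholds below are assumptions for illustration, not product settings.

```python
# Illustrative tiering policy: hot data stays local, warm data moves to
# a cheaper internal tier, cold data is tiered out to public cloud
# object storage via an S3/REST interface.
def choose_tier(days_since_access):
    if days_since_access <= 30:
        return "primary"
    if days_since_access <= 365:
        return "private-cloud"
    return "public-cloud-s3"

print([choose_tier(d) for d in (7, 90, 1000)])
```

Automating this as a data management policy, rather than moving files by hand, is what keeps the hybrid arrangement controllable as volumes grow.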


This coming year you will see us accelerate our enhancements across the portfolio with some new products working natively in the cloud, flexible deployment options for existing products and new consumable services.

(This is a continuation of a previous blog)


Continuing on our Back to the Future parody, you may remember the scene when Marty McFly, posing as his future son, walks into the Café 80’s.  He is confronted by two artificial avatars, Michael Jackson and Ronald Reagan, competing to take his order.  Although we don’t have Max Headroom-like cyborgs playing maître d' in restaurants (but we can draw comparisons to Apple’s Siri), this scene illustrates how machines can be employed to improve the customer service experience, giving rise to my next prediction:


#2 - Competitive industries will ramp up Big Data initiatives to gain competitive advantage


Although adoption of Big Data across Asia Pacific remains low, with over half of organizations making limited headway, particular industries are ahead of others.  Certainly businesses operating in competitive industries are no longer looking at Big Data as an initiative, but as an imperative.  As initial projects show promising new insights and customer engagement, competitors are making similar investments, driving a new “arms race” in key verticals.


The majority of big data adopters now appear to be banks and other financial services firms.  Employing correlated analytics on in-house data to assess things like borrower risk, churn detection and cross-selling/upselling other products based on spending behaviours has helped a number of financial institutions retain their valuable customers as well as drive more wallet share from them.


But this is only the beginning, as we are now seeing the next wave of projects which will provide even greater insight through mashing up even more data sources.  An example of this is MasterCard’s intent to mine Facebook data from its Asia Pacific user base to uncover behavioural insights to sell back to retail banks. This “insight enrichment” concept demonstrates how in-house personal data, public social media data and geographical location from mobile devices will enable a new level of customer engagement and revenue potential.


In the telco world, with 4G expected to drive 14.5 times more data traffic than non-4G connections, Mobile Network Operators (MNOs) are investing in software to analyse wireless data in flight to optimize networks for demanding content delivery like video. In this highly competitive market, the benefits are clear, as ensuring quality of service reduces customer churn.


In countries like Indonesia, where 168 million people are connected to the mobile network, service quality is a necessity to stay profitable.  This is where companies like PT Telkomsel are putting their data to work.  By understanding its customers' usage patterns, Telkomsel is using real-time analytics to identify the ‘next best offer’ for its subscribers, with a view to moving customers onto higher-yield service plans.  The result: 2 million customers per month are being upsold to broader mobile plans.


Governments are also realising the potential of big data to improve citizen services.  With open data initiatives in countries like Australia, New Zealand, Singapore, South Korea, India and Hong Kong, thousands of datasets are now available for public consumption absolutely free.  For example, in Singapore, data from travel cards, GPS on trains and buses, and openly available data on schedules were used to draw up a detailed model of how the residents of the city move through their transport system, resulting in a drop in peak-hour travel of between 7% and 13%.


This next generation of business intelligence solutions will not only require new infrastructure architectures to store and manage vast data lakes of information; combining industry-specific data interpretation with various interconnected software platforms will also be critical to delivering predictable and supportable projects.


In the world of Big Data applications, data can be ingested from many places and in many forms.  Collecting bulk data (e.g. from enterprise data warehouses) through means like ETL versus streaming data (e.g. machine/device data, social media feeds), where events and metadata need to be extracted, requires different profiles of network and compute bandwidth.


Similarly, storing this data is not as trivial as sticking it into a traditional relational database.  The massive size and frequency of such data streams require things like distributed filesystems, memory grids, key-value and graph databases, in some cases all of them, to help manage data at such scale.  As we see open source platforms like Hadoop gain greater adoption in the enterprise, organisations will be looking for support to implement, integrate and maintain such analytic systems.  This has led a number of solution providers like MapR, Hortonworks and Cloudera to expand their presence across Asia.
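The core operation these platforms distribute across a cluster is simple to state: group a stream of events by key and aggregate. Here is a minimal single-process sketch of that pattern; the event shape is illustrative, and a real deployment would shard this work across nodes with something like Hadoop or a key-value store.

```python
from collections import defaultdict

# Single-process sketch of the per-key aggregation a distributed
# map-reduce or key-value platform performs at scale.
def aggregate(events):
    counts = defaultdict(int)
    for event in events:
        counts[event["device"]] += 1
    return dict(counts)

stream = [{"device": "meter-1"}, {"device": "meter-2"}, {"device": "meter-1"}]
print(aggregate(stream))
```

The reason key-value structures suit streams is visible even here: each event touches only its own key, so the work partitions cleanly across machines.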


Processing and computing the volume and velocity of Big Data means that traditional server architectures are inherently inefficient for this type of workload.  This is leading to a different class of server hardware specification, one which is optimized for in-memory processing, scale-out modular interconnects and software-defined provisioning and management programmability.  Projects such as the Open Compute Project and Novena have accelerated the development of these platforms, but enterprises will need a greater level of support to put them into production.


From a software perspective, PaaS can help accelerate many of the development and operational requirements of Big Data applications.  Much like the IaaS space, this is becoming an increasingly noisy market, with vendors like Pivotal, Red Hat OpenShift, Microsoft and Amazon all vying for their platforms to gain the greatest amount of market traction.  When it comes to visualization of data, organizations are looking to abstract this layer as much as possible from their underlying data management layer.  In many cases businesses are already using products like Pentaho, Tableau or Datawatch and would like to leverage these familiar user experiences in their enterprise.


Hitachi Data Systems has been partnering with SAP for a number of years to deliver one of the industry’s most scalable HANA appliances on the market.  In southern Australia, the power distribution industry depends on it for delivering analytics for smart meters, helping consumers understand their electricity usage behaviours.


In 2015, you will see even more integrated solutions from Hitachi Data Systems across a variety of industries that help organizations accelerate their adoption of big data.  This will not only be in the form of products, but services and local partnerships across the region so watch this space.

(This is a continuation of a previous blog)


In Back to the Future II, Zemeckis cleverly depicts the interaction of people and the things around them.  Scenes like the Jaws 19 holographic advertising, voice-activated automation of the McFly home and seemingly instant policing when Biff crashes into the Clock Tower gave us a fascinating look at what is possible when our urban environment responds to us.  Even if our reality today is not quite the same, we are well on the way to intelligent things, and this inspires my first prediction:


#1 - Smart City initiatives will drive greater investment in the Internet of Things


Asia Pacific is home to some of the largest and fastest-growing urban areas on the planet.  It also has some of the world's most underdeveloped infrastructure, densest cities, fastest-growing energy consumption, busiest transport routes and most active natural events, and is arguably most at threat from climate change.  Last year, the Smart Cities Council assessed 129 cities across Asia and Africa most in need of interventions to address many of these issues.


These pressing demographic and social developments have led a number of governments, including Japan, India, China, Sri Lanka and South Korea, to embark on Smart City initiatives to tackle these urban challenges, manage energy and resource consumption and prepare for further growth.  In India, Prime Minister Narendra Modi has promised 100 smart cities and industrial corridors to make India a manufacturing hub.  In China, the Ministry of Housing and Urban-Rural Development has selected 193 local governments and economic development zones as official smart city pilot project sites, making them eligible for funding from a ¥100 billion ($16 billion) investment fund sponsored by the official China Development Bank.  In Singapore, the Infocomm Development Authority (IDA) is trialling a number of intelligent solutions in the Jurong Lake district as part of its smart nation initiative.



These nation building initiatives across the region will propel significant momentum in the development of intelligent social infrastructure solutions that combine Mobility, Advanced Analytics, the Internet of Things (IoT) and Machine to Machine (M2M) interaction.


The IoT market is forecast to top US$9.96 bn across the region in 2014, growing at a CAGR of 34.1% to reach US$57.96 bn by 2020. Global M2M adoption increased by over 80% in the past year to reach 22%, but in Asia Pacific, the Middle East and Africa growth was 27%, surpassing the established markets of America and Europe.


To manage the expected 50 billion devices that will be connected to the internet in the next 5 years, and the new generation of application platforms that will drive them, a different kind of infrastructure topology is emerging.  Not only will this data-accelerated workload require an unprecedented scale of computing, network and storage optimised to deal with the volume, velocity and variety of data it consumes, but it will be fundamentally built on a new set of attributes.


Connecting everyday devices and appliances to the Internet so they are able to communicate with services and with each other is important.  Even if standards for the Internet of Things are yet to be settled, MQTT and CoAP appear to be the front runners and, most importantly, are open, which will springboard their adoption. When it comes to applying this to cities, the British Standards Institution recently released its proposal for the Smart City Concept Model (SCCM), PAS 182, providing a basis for the interoperability of systems and data-sharing between agencies.  Hitachi has almost two decades of experience connecting things to IT systems through our early pioneering days in RFID.
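To give a feel for what device communication over a protocol like MQTT involves, here is a sketch of the kind of hierarchical topic and JSON payload a city sensor might publish. The topic scheme and field names are my own illustrative assumptions; a real deployment would publish through a broker using a client library such as paho-mqtt.

```python
import json

# Sketch: build an MQTT-style topic and payload for a smart-city
# sensor reading. Hierarchical topics let subscribers filter by
# city, district or sensor type using wildcards.
def make_message(city, district, sensor_type, sensor_id, reading, unit):
    topic = f"{city}/{district}/{sensor_type}/{sensor_id}"
    payload = json.dumps({"value": reading, "unit": unit})
    return topic, payload

topic, payload = make_message("singapore", "jurong-lake", "light", "s42", 512, "lux")
print(topic, payload)
```

The lightweight publish/subscribe model is precisely why MQTT and CoAP suit constrained devices: the device only needs to emit small messages to a well-known topic, and the platform handles fan-out.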


To deal with the scale of connected devices, the velocity of data ingestion, and the computation and storage power required to process and deliver services back, there needs to be a paradigm shift in the type of core infrastructure provided.


Web-scale IT, pioneered by the large cloud services providers such as Amazon, Google and Facebook, has emerged as the preeminent model for building IT at the scale of Smart Cities.  Once the domain of global giants rich in engineering skills, solutions are now reaching the enterprise.  In fact, one analyst believes that as many as half of enterprises will be operating this model in the next two years.


As a new category of integrated system, hyper-converged systems are characterized by scale-out resources (compute, memory, network, storage), programmable management and extreme fault tolerance.  Not only do they deliver the scale and reliability such demanding workloads require, but they are more agile and adaptable to changing business needs.  By abstracting underlying hardware resources and bringing powerful hypervisor control together with network and storage functions, applications will be able to leverage a much richer way of orchestrating and automating infrastructure for their needs.  Although it is still early days for Software Defined Data Center technology, the ability to implement infrastructure functionality like storage in software and manage it through APIs will be a foundation capability for scaling smart city apps.


There is no shortage of new players in this adaptation of the technology; companies like Nutanix and SimpliVity provide appliances in various flavors and packaging.  We also saw VMware announce EVO: RAIL in August, leveraging its vSAN technology, with Hitachi bringing this solution to market next year.


Leveraging virtualization, hyper-converged systems are nicely suited to a number of today’s storage challenges; however, it's really only the beginning, as we will see much broader use cases for this technology next year.  We will see this architecture underpinning the IT infrastructure for many intelligent infrastructure solutions, particularly in Telecommunications, Healthcare and Public Safety.


The critical ingredient besides device connectivity and data centre infrastructure is of course the software.  Applications that put the data into context, automate the business rules, and distill and refine the data to help users visualize information are what make these solutions game changing.  This is why, back in October, Hitachi Data Systems announced the acquisition of Pantascene and Avrio to bring end-to-end solutions in the public safety space.


With a global business in social infrastructure as well as information technology, Hitachi is in the unique position to leverage the multiple disciplines required to deliver highly integrated solutions to Smart Cities.  In fact, Hitachi already has a number of smart city projects in places like India and China, and last year was awarded Asia Pacific Smart City Solutions Provider of the Year.

As I peer into my proverbial crystal ball to see what the future holds in enterprise tech for the year ahead, I can’t help but go back to my early teens, to the year 1985, and reminisce about one of my favorite movies of all time.  In the smash hit Hollywood trilogy Back to the Future, director and creator Robert Zemeckis imagineered what our world would be like in the year that now stands before us, 2015. Fans will recall that in his second installment, the eccentric Doctor Emmett Brown, who fashioned a time machine out of a coupe, propelled the teenage rock star wannabe Marty McFly in the flying DeLorean thirty years into the future to give us this glimpse.


Despite Zemeckis being loath to create a film that predicts the future, he did manage to get a few things right!  And even if we don’t have flying cars zooming across our skies, those uber-cool self-lacing Nikes or hoverboards (there are a number of promising prototypes), innovations like video phones (Skype), controllerless motion-sensing video games (Microsoft Kinect & Nintendo Wii) and biometric payment systems (Apple Pay & Hitachi’s Finger Vein technology) are very much part of our lives today.



Economists and even psychics will admit predicting the future is super hard to do, but arguably Zemeckis’ greatest achievement in this film is depicting just how much technology has become embedded in everyday life.


Similarly, the interplay between business and technology appears not only more seamless than ever before, but critical.  As we have seen exciting new markets emerge, age-old companies falter and consumer service expectations change forever, the winners of tomorrow’s economy are those who are transforming today.


The Business Defined IT era is here, and the need for IT organizations to embrace the third platform, built on mobile devices, cloud services, social networks and big data analytics, is now.  The CIO must respond to these trends and become an architect and broker of business services rather than a technology builder focused on data center infrastructure.


Here in Asia Pacific, although CIOs have won the respect of the business, moving further upwards may take more convincing.  According to a recent study from the Economist Intelligence Unit, almost nine in ten (89%) of those surveyed believe the CIO has a strategic role that goes beyond managing the IT function.  However, nearly one in three (30%) respondents do not believe the CIO should be a candidate to succeed the CEO.


Together with Hu Yoshida’s top 10 technology predictions for 2015, I have set the time circuits to identify how they will play out in five key social and business trends evolving in this dynamic region:


#1 - Smart City initiatives will drive greater investment in the Internet of Things

Asia Pacific is home to some of the largest and fastest-growing urban areas on the planet.  The region also has some of the world’s most underdeveloped infrastructure, densest cities, fastest-growing energy consumption, busiest transport routes and most active natural events, and is arguably the most at threat from climate change.  The opportunity for Internet of Things and Machine-to-Machine interaction is now apparent, as a number of governments across the region have committed to national initiatives to propel the proliferation of smart cities. Read more here.


#2 - Competitive industries will ramp up Big Data initiatives to gain competitive advantage

Although adoption of Big Data across the region remains low compared to other geographies, organizations operating in competitive industries are no longer looking at it as an initiative, but as an imperative.  As initial projects show promising new insights and customer engagement, other companies will make similar investments, driving a new “arms race” in key verticals.  Read more here.


#3 - Hybrid Cloud will emerge as the preferred way to deploy Enterprise Applications

As cloud platforms reach a level of maturity and established vendors and service providers in the region fiercely compete for market share, the stage is set for organizations to transform their core applications on a mix of private and public clouds.  Solutions which integrate both platforms to deliver a seamless Hybrid Cloud experience will help organizations realize better alignment of cost while meeting important privacy and compliance requirements. Read more here.


#4 - The mobile explosion will prompt supporting information infrastructure to be more data-driven

Asia Pacific is the world’s largest mobile region, with analysis showing 1.7 billion unique subscribers in 2013, making it half of the global connected population.  Over 750 million new subscribers are expected to join over the next five years.

The launch of high-speed 4G services across the region is helping remove limits on internet use, giving smaller businesses greater outreach to customers, and fundamentally changing the way in which we interact. Read more here.


#5 - As technology drives more implications for personal privacy, business will increase investments to address compliance

With technology now pervasive and governments introducing new or updated privacy regulations across Asia Pacific, organisations will be forced to place greater emphasis on their internal privacy policies and look to technology to assist them. Organisations that successfully transition to the new privacy-protected era will have introduced a culture of compliance among their employees, and made smart investments in their data collection and audit practices. Read more here.


Over the next few blog posts, I plan to talk in more detail about how these technologies will evolve in the context of business outcomes, and also surface use cases and companies already leading the charge.


So buckle up and let’s take this puppy up to 88 miles per hour…


UPDATE: We will be hosting a Google Hangout with an all-star lineup of panelists on 28 January.  Plan to join in the discussion!


Any given Sunday, if you drop by the De Luca household at dinner time you will find us tucking into a freshly made, aromatic wood-oven pizza.  Being from Melbourne, I openly admit to being a food snob, which means even my takeout pizza needs to notch up at least a couple of Michelin stars.  But despite my Italian heritage, I am simply incapable of replicating the authentic southern taste.  For this I entrust Rocco, a Nepalese-born pizzaiolo with 20 years in the business.  And with a two-year-old boy unable to stay still for longer than 30 seconds, dining at home is the only way to savour it.


So what has our Sunday night ritual got to do with the business of running IT?


According to Albert Barron, a software architect at IBM, it turns out it’s a great way to explain the different cloud models.  With some minor changes, I thought I would use it to explain the different Infrastructure-as-a-Service (IaaS) models.



In the context of enterprise application infrastructure, the “Made at Home” model is analogous to rolling up your sleeves to prepare the base of hardware, flavoring it with the right software and services for your business, and eating it in your own data center.  Yes, I realize how weird that sounds; stay with me.


Although the “Made at Home” model has been the mainstay of building infrastructure for the past two decades, in the open systems world much of this is quickly moving to integrated infrastructure consumed as a private cloud.  The trend is clear: in 2013, while shipments of servers and storage were subdued, IDC estimated over 3,200 integrated infrastructure units were shipped in Asia Pacific (excluding Japan) alone, representing a 140% increase over the year before.  You take a vendor’s ingredients and recipe and cook it in your own data center.  In our pizza analogy, it means taking a walk down the frozen food aisle and doing a “Take and Bake”.


So why is this a no-brainer for jump-starting your journey to the cloud?  When it comes to designing and provisioning apps like Exchange, SharePoint and SQL, the ingredients are virtually the same for every environment; only the scale, or in our pizza example the number of slices, changes. Having vendors assemble, test and certify an integrated solution instead of your internal IT department delivers various business benefits, including increased agility through accelerated deployment and lower operational costs through greater use of software automation.


On the other side, there is no doubt public cloud (the “Dining Out” model) is experiencing greater levels of adoption.  Complete extrication from infrastructure ownership and elastic pay-per-use pricing models make the CIO look like a magician to the CFO. But while public cloud can be a great choice for certain workloads and data types, savvy CIOs have come to realise public clouds alone will not meet all their requirements today.


This begs the question: why can’t I get the best of both?  Well, the answer is you should, and you can: it’s called a Managed Hosted Private Cloud.


In the scheme of cloud terminology it’s a mouthful, I know, but it takes the agility and outsourcing attributes of public cloud and couples them with the privacy, control and configuration customisation benefits of private cloud.  For the business, it allows you to meet sovereignty, governance and compliance requirements with greater surety.  For IT, it places you firmly in a position to outsource your infrastructure operations over time.  Back to our pizza analogy: it’s the “Pizza Delivered” option where the oven (data center), gas/electric (power/cooling) and pizzaiolo (experts) are thrown in.


Here are four reasons why building infrastructure this way makes sense:


It’s all about workload agility

Public clouds have not only helped keep IT budgets in check by moving transient workloads like UAT and iterative ones like DevOps onto lower cost resources, they have also enabled applications to be more “fluid” and rapidly evolve thanks to the relative ease with which instances can be spun up, spun down and destroyed.  Some applications also benefit from regions and availability zones, especially web-scale applications which connect millions of end users around the world.  Regions, together with Content Delivery Networks, enable cloud providers to deliver fast response times to all of their customers regardless of location.  These workloads have made public cloud not only popular, but a more sensible choice for deploying applications.  However, the reality is today’s businesses run a patchwork of traditional applications, where releases are much more stable and performance is predictable. Furthermore, in a number of industry sectors, like banking, country-based regulations strictly prohibit customer information and processing from leaving the institution, relegating many core banking applications to dedicated infrastructure.  This is where private cloud infrastructure can not only provide the same agility as public cloud, but at a more sustainable cost point too.


Doing “Real” Cloud

If you ever bump into a CFO or procurement officer at a bar, ask them if they know their true cost of running IT.  I’m betting nine out of 10 would have no idea.


It’s not their fault; the fact is there are so many components beyond the procurement of infrastructure that most simply can’t do it.  Things like data center floor space, power, security and manpower are just difficult to account for.  And managing leases, where upgrades and terminus conditions come into play, is a financial nightmare.  A managed hosted (off-premises) private cloud not only allows you to track all of your resource utilisation at a more granular level thanks to modern orchestration software, but significantly simplifies the commercial model by bundling all these components into a utility or pay-as-you-go schedule.


Addressing the talent deficit

Although IT skills have been advancing over the past two decades, adapting the curriculum to keep pace with technology changes is a constant challenge for tertiary institutions.  In Asia Pacific, when it comes to cloud skills, according to this recent survey only one-third of learners admit to having expert skills.  With such a deficit, businesses employing in-house experts like cloud architects and practitioners need to be prepared that these people won’t come cheap.  Worst case, the people costs could outweigh the savings in infrastructure.


A managed cloud solution has the potential to offer multiple benefits.  By leveraging specially skilled personnel, you not only reduce the training and development budget, but also have the opportunity to adopt the best practices of solution providers.


Step to the Hybrid Cloud

According to Gartner, nearly half of big companies will have hybrid cloud deployments by the end of 2017.  And a recent survey by ESG on private cloud trends found that behind elasticity of resources, the second most sought-after characteristic of a private cloud was universal access to public cloud resources. Smart IT shops are no longer building application infrastructure on in-house silos; instead, they are working out which attributes each business application needs and placing its components on the best-fit cloud platform. For example, although the processing of a core ERP application may best sit on a private cloud, archiving and data protection could go out to public cloud, realising huge efficiencies in management and cost.
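To illustrate this "best-fit placement" thinking, the decision logic can be sketched as a simple policy function. Everything below is a hypothetical example: the attribute names and rules are mine for illustration, not drawn from any product or from the Gartner or ESG research cited above.

```python
# Illustrative sketch: place application components on a best-fit
# cloud platform based on workload attributes.
# All attribute names and rules here are hypothetical examples.

def place_component(component: dict) -> str:
    """Return 'private' or 'public' for a workload component."""
    # Regulated or sensitive data must stay on dedicated infrastructure.
    if component.get("regulated") or component.get("sensitive_data"):
        return "private"
    # Highly elastic or archival workloads suit public cloud economics.
    if component.get("elastic") or component.get("role") in ("archive", "backup"):
        return "public"
    # Default: keep steady-state workloads on private cloud.
    return "private"

# Hypothetical components of a core ERP application, as in the example above.
erp = [
    {"name": "core-processing", "role": "oltp", "regulated": True},
    {"name": "archive", "role": "archive"},
    {"name": "data-protection", "role": "backup"},
]

placement = {c["name"]: place_component(c) for c in erp}
print(placement)
# {'core-processing': 'private', 'archive': 'public', 'data-protection': 'public'}
```

In practice the attribute list would be richer (latency sensitivity, data sovereignty, release cadence), but the shape of the decision stays the same: classify each component, not the whole application.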


Last week, Hitachi Data Systems made some important announcements helping customers to realise this increasingly popular mode of IT.


By supporting bursting from VMware’s latest vCenter stack to vCloud Air, and Microsoft’s Windows Azure Pack for Windows Server 2012, we not only deliver seamless management between private and public clouds, but support deployment of both on the same physical infrastructure.


Understanding that organisations have different financial objectives, we introduced three flexible commercial models for consuming cloud: traditional capex purchase, pay-per-use IaaS, and FlexBuy, which is a blend of both.


And finally, we partnered with Equinix, a global data center provider with a presence in 32 major business hubs around the world, to complement our existing local data centre providers.  Situating your private cloud in a world-class facility with the data center costs bundled in not only reduces upfront investment; leveraging Cloud Exchange to interconnect with other commercial clouds reduces ongoing costs too.  With Equinix being the preferred data center for some of the biggest public cloud providers like Amazon and Microsoft, a direct, secure link provides a predictable SLA and reduces transmission costs by avoiding public network charges.  Having all these public providers in your neighborhood not only means you get better economics, but also a way to migrate between them when new services become available.


While moving to a managed hosted private cloud involves multiple decisions with various stakeholders in the business, it does help strike the balance between control, cost and agility that businesses need to thrive today.  Hitachi Data Systems, through its intimate understanding of enterprise IT needs and vast ecosystem of partnerships, is helping make this decision easier to digest.


So, hungry anyone?

I recently had the privilege of spending two days with 30 CIOs from various South East Asian countries in the Chinese entertainment mecca, Macau.  I have to say I found their rapidly evolving perspectives on cloud just as impressive as Macau’s changing skyline!


Only one year ago, many of these CIOs were using public cloud services exclusively for application development and testing purposes. Now a good number of them are running select applications with production workloads.  So why the change?


The Mobile Infringement Notices trial run by the New South Wales Police in Australia provides a hint.  This project demonstrates that even for applications that deal with highly sensitive personal information, like vehicle registration and license details, public cloud can be useful for parts of the workload: in this instance, delivering middleware processing at scale without compromising government regulations.


Although the majority of enterprise cloud workloads today run in either on-premises or managed private clouds, things are changing quickly.  The nexus forces of greater competition among public cloud providers (as seen in the recent Cloud Storage Wars) together with cheaper WAN bandwidth in metropolitan areas are making Hybrid Cloud a more cost-effective way to platform applications. Even our very own Chief Economist, David Merrill, recently blogged that we have reached a cross-over point where the economics just make sense.


But another macro trend propelling this rise in Hybrid Cloud adoption is mobility. Although many people associate mobility with app delivery through smartphones and tablets, Hu Yoshida describes it more broadly as a transformative way in which we work.  Beyond endpoint devices, mobility encompasses the data within data centers: on premises, off premises, across metropolitan boundaries or even continental borders.


Hybrid Cloud is gaining momentum, with some surveys suggesting that 70% of organizations are either using or evaluating it, and analysts predicting it will represent as much as 30% of workloads in the next four years.  To truly reap the benefits of combining the best of private and public cloud, it’s vital to tackle a few important attributes.


A recent blog by Colm Keegan at Storage Swiss highlighted four obstacles to cloud storage adoption which really echo what I have heard from the CIOs.  So in this blog, I will talk about how Hitachi Data Systems and the recent enhancements to our Content and Mobility portfolio help address them:


  1. Integration. The fact is most legacy applications don’t talk the cloud language (like REST and S3), making most public cloud services difficult to consume.  As Colm rightly points out, the process of re-coding legacy applications is not for the faint-hearted; it can be resource-consuming, not to mention an expensive proposition. Hitachi Content Platform (HCP) Version 7 now supports adaptive tiering to a number of public cloud services like Google, Amazon and Azure. By presenting standard CIFS and NFS interfaces, HCP acts as a seamless gateway between legacy applications and your cloud-based repository.
  2. Performance.  Although the instant provisioning and bottomless capacity attributes of public cloud services remove a big operational burden for administrators, they come at the expense of high latency.  The performance penalty for reads and writes renders them unusable for applications which demand high performance.  However, many of these applications only need a small subset of data for short periods of time.  Therefore, technologies that employ clever caching algorithms to shuffle data between a local device, like Hitachi’s Data Ingestor (HDI), and a cloud storage repository like HCP over a WAN give you the best of both worlds.
  3. Security. Despite overcoming many waves of resistance over the years, one question that still remains with pure public clouds is security.  With horror stories like the one experienced by Code Spaces a few weeks ago, CIOs quite rightly question the risks of putting certain workloads into open environments. With HCP used in a hybrid cloud topology, sensitive data sets can remain on-site while less important data is archived to the public cloud, all managed by a consistent set of tools and policies.  Another benefit is full auditability of access, something that is not offered by public cloud storage services.  Furthermore, encryption can be managed in-house within HCP instead of by the public cloud service provider, giving you an extra level of protection. And when it comes to file sync and share, HCP Anywhere delivers a 100% on-premises solution, with the latest version 1.2 including additional security features like ICAP support, MDM, single sign-on, link management and de-registration.  But don’t take my word for it: in February Gartner scored HCP #1 in the Security category of its Critical Capabilities.
  4. Cloud Lock-In.  In the world of traditional IT procurement, organisations protected themselves against lock-in by purchasing products and solutions built to standards, creating an environment where vendors had to compete for their business.  It’s only natural to want the same when investing in cloud services.  Another genuine concern amongst CIOs is: once I put my data in, how can I get it out?  The collapse of cloud storage specialist Nirvanix last year sent shockwaves through the industry, giving customers only two weeks to move out their data.  Although this is an extreme case, the fact is you may want to move data for lots of different reasons: cost, latency, regulation, sovereignty or just better service from another supplier.  With the latest version of HCP, you can use adaptive tiering to seamlessly migrate data objects between service providers, all without disruption, maintaining your balance of choice.
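To make the tiering behaviour described in points 2 and 4 concrete, here is a minimal sketch of the kind of policy such a gateway might apply: sensitive objects always stay on the local tier, while cold, non-sensitive objects become candidates for the public cloud tier. The field names and the 30-day threshold are hypothetical illustrations of the general technique, not HCP’s actual implementation or API.

```python
import time

# Illustrative sketch of a hybrid-cloud tiering policy.
# Field names and the 30-day threshold are hypothetical examples.
COLD_AFTER_SECONDS = 30 * 24 * 3600  # objects idle this long count as "cold"

def choose_tier(obj: dict, now: float) -> str:
    """Return 'local' or 'public-cloud' for a stored object."""
    if obj.get("sensitive"):
        return "local"          # sensitive data never leaves the site
    idle = now - obj["last_access"]
    if idle > COLD_AFTER_SECONDS:
        return "public-cloud"   # cold data tiers out to cheap capacity
    return "local"              # hot data stays close for low latency

now = time.time()
objects = [
    # sensitive: stays local regardless of age
    {"key": "patient-records.db", "sensitive": True, "last_access": now - 90 * 24 * 3600},
    # idle 60 days: tiers out to public cloud
    {"key": "q1-report.pdf", "sensitive": False, "last_access": now - 60 * 24 * 3600},
    # touched an hour ago: stays local
    {"key": "active-project.docx", "sensitive": False, "last_access": now - 3600},
]

for obj in objects:
    print(obj["key"], "->", choose_tier(obj, now))
```

The same policy hook is where lock-in protection lives: because the gateway owns the placement decision, swapping the public-cloud target for another provider is a configuration change rather than an application rewrite.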


Here in Asia Pacific, the public cloud landscape is becoming increasingly competitive, with many players established in major locations around the region.  With vendors investing heavily in the integration of their private cloud solutions, the popularity of Hybrid Cloud is set to jump.


So what plans do you have to use Hybrid Cloud, and what workloads do you think are best suited to this environment?

For those of us working in IT, I’m afraid I’ve got some bad news: our business masters are not happy with us.  According to a recent survey from McKinsey, just over half of all executives say their CIOs have a significant impact on their organizations’ business issues.  Ouch!


They also said they want to spend on average 8% less on infrastructure, but 6% more on innovation. This is all well and good for our VPs to say, but how do we IT practitioners actually achieve it?


Before we go beating ourselves up, it’s worthwhile understanding how we got here.


What the (almost) great depression taught us

The Global Financial Crisis of the late 2000s not only sent many businesses to the wall and governments scrambling for economic stability, it forced many companies to take a good hard look at their costs and figure out new ways to become efficient.  As consumer demand slumped to an all-time low, competitive forces became fiercer than ever as survival instincts kicked in. With IT seen as a high-cost center, it was inevitable that the bean counters went knocking on its door first to look for savings.


And probably with good reason too! I mean how do you explain to an accountant why all the storage you purchased a year ago is only 30% utilized when costs continually go down? Or how do you justify why software to help you streamline operations remains uninstalled because you were too busy keeping the lights on?


They will ask: why can’t we just pay for what we need, when we need it? And if we don’t need it anymore, just hand it back?  The fact is IT needs to align to market economics, not technology economics, if it is to remain sustainable.


Expectations have changed forever

In an internet-driven, globalized economy, the production of goods and delivery of services is won and lost on realizing economies of scale.  The differentiator for customers is no longer the product itself, but the speed and convenience with which they can get it.  Customers have more choice than ever before, so their expectations of service have not only gone up, but changed altogether.


Think about the last time you wanted to buy something. How did you do your research?  Did you jump in your car, drive down to the shopping mall and walk the aisles?  Did you get out the Yellow Pages and start ringing stores in your area?  The answer for most of us is no: we took our smartphones or notebooks, punched what we were looking for into our favorite search engine and commenced our journey of discovery.  Chances are you didn’t even purchase it from your own neighbourhood or country, but rather from somewhere overseas that could conveniently deliver it to your door.


And the most likely reason you chose that supplier is that they made it really easy for you to purchase from them. Maybe they allowed you to browse details of the product and pay from a mobile device, or helped you make your decision by showing what other customers thought of the product and what else they purchased.


Delivering the mobility-enabled business anywhere, anytime and on any device is no trivial matter, but it is now how companies must compete for our hearts and wallets.


IT is no longer black art and changing the game… again

I remember when walking into a board room to deliver an IT strategy presentation to line-of-business managers felt like a meeting between two alien races. Either their eyes would glaze over before I got to the second slide, or they thought WINDOWS stood for Wished I‘d Never Deployed On Work Station.


Speaking different languages is one thing, but having misaligned priorities made synergies between functions really tough.  According to this study last year, only 10% of top marketing and IT executives believe collaboration between their corporate functions is sufficient.


Thankfully, the discussion in the boardroom these days is changing, and the C-suite is becoming much more comfortable with IT.  Decades of exposure to implementing systems like Enterprise Resource Planning (ERP), together with the consumerization of IT through the internet and mobile devices, has not only made them more knowledgeable, but empowered them to envision its possibilities. Sharing information between departments, redefining business processes and reaching new customers is now easier than ever before.


Of course, all of this does not mean the role of the CIO is redundant; far from it!  Greg Baster, CIO of property group GPT, summarized it perfectly when he said “it’s not a change in role, it’s a change in emphasis”. With line-of-business leaders now stakeholders shaping the technology strategy, IT becomes a business partner, as opposed to just a naysaying support function.


We are already seeing the outcomes of such collaborations; in fact, they are causing industry-wide disruptions.  Take for example Yu'E Bao, a new kind of financial institution backed by e-commerce giant Alibaba.  It offers the convenience of internet-enabled on-demand deposits with higher wealth management returns, using technology-enabled insight from advanced analytics of Alipay accounts.  By predicting when users will move money in and out of their accounts, it is able to better manage liquidity risks and offer customers almost twice the interest rate of traditional banks.  In less than two years, it has attracted over 81 million customers to this new model, really shaking up traditional markets in China.


A new imperative for IT

These economic evolutions, along with the nexus forces of cloud, mobility and insights through analytics, have changed the landscape for IT practitioners forever.  The imperative to let business drive IT, and not the other way around, is now apparent if businesses are to not only survive, but thrive in this brave new world.  To lead this, we need to envision a new way to build IT; we need a Business Defined IT approach.


I know it sounds like another “markitecture” buzzword, but a Business-Defined IT strategy calls for superior capabilities for delivering services faster and continuously.  Sure, the infrastructure that supports these applications needs to continue to be reliable and scalable, but moreover it needs to be responsive to change and future-proof enough to adapt to the unknown.  In other words, it needs to be software-defined to make new functionality available quicker, highly automated to bring value instantly, non-disruptive to service customers 24/7, extensible to deal with hyper-growth and virtualized to meet changing demands.


This week, Hitachi Data Systems is announcing the first important step toward this new imperative; we call it Continuous Cloud Infrastructure.  You will hear a lot about how we are bridging yesterday’s legacy data center to tomorrow’s mobilized IT-as-a-service.


In the next series of blogs, my colleagues and I will take you on a walk, drive and flight across the world to show you how Business Defined IT is already happening now, and how you can harness its power in your business.


But we also want to hear your perspectives and the initiatives going on in your business, so please share them with us here.

In the 1980s, the phrase “Asian Century” was coined to describe the region’s anticipated economic prosperity in the 21st century. Thirty years on, it would seem that the vision is quickly becoming a reality.


Japan, for one, has had a long history of challenging the West and building competitive international businesses in sectors such as automotive, industrial machinery and power generation. Over the past 20 years, South Korea has also emerged as an international innovator with brands such as LG and Samsung becoming household names in home appliances and consumer electronics throughout the world. The Southeast Asian Tiger countries have some of the fastest growing economies in the world, and China just surpassed the United States as the world's largest trading nation.


But can Asia also establish itself as a powerhouse in the digital arena? If so what are the key factors that will enable the region to create the digital equivalent of the Asian Century?


Perhaps we can borrow an insight from Herbert Hoover, the 31st President of the United States, who once said “Competition is not only the basis of protection to the consumer, but is the incentive to progress.”


The spirit of competition is a core ingredient if Asia is to succeed in the digital age. And the region certainly has that in plentiful supply.


Big aspirations


Asia’s big population is matched only by its big dreams. As the digital economy ripples across the globe, aspirations to succeed in this race are also going viral throughout this rapidly maturing continent.


In India, the rise of the IT sector has been simply astounding. In the past 15 years, its contribution to national Gross Domestic Product (GDP) grew from 1.2 percent to 7.5 percent. Companies such as Infosys, Wipro and Tata Consultancy Services, who all started out providing low-cost IT operations outsourcing services to overseas companies, are now transforming themselves to deliver more lucrative and higher-value services such as consulting and Business Process Outsourcing (BPO).


China, too, is making its mark on the digital stage. Just as 2013 was coming to a close, e-commerce behemoth Alibaba announced plans to expand its cloud services outside its domestic borders, with rumored locations including the U.S. and Southeast Asia, poising it to challenge the likes of Amazon, Google and Microsoft in the cloud space.  It also recently achieved the world's first gold certification for cloud security from the British Standards Institution, cementing its desire to play on the world stage.


Show me the money!


Do these digital aspirations have people reaching for their wallets to invest? It would seem so.


We are seeing major dollars and rupiah going into one of the core building blocks of the digital ecosystem: startups. Asia is fast becoming a hotbed of tech funding activity. According to Internet DealBook, Asia Pacific had the highest average deal value for technology investments and acquisitions compared to North America and Europe in Q3 2013. This demonstrates the value of innovation coming out of the region.


Of course it takes more than the confluence of visionaries and cash to ensure success in this brave new digital world; they need to be connected for the real magic to happen.


In places such as Silicon Valley and New England of the United States, there are well-established ecosystems that not only create a healthy competition for funds, but also provide collaboration opportunities between startups and VCs — and amongst the startups themselves — to accelerate innovation and move to the next stage of growth.


It has proven incredibly difficult to replicate the success of Silicon Valley, but signs show that things are moving in the right direction in Asia. Thanks to entrepreneurs offering mentorship as well as seed money to accelerate the commercialization of their prototypes, tech startups in the region are finding it easier to survive beyond early-stage incubation.  Today, we see a host of startup accelerators springing up around the region. Examples include Singapore’s JFDI.Asia, South Korea’s SparkLabs, Hong Kong’s AcceleratorHK and India’s TLabs.


Another critical area that Asia is investing heavily in is infrastructure. Take data centers, for example. Digital ecosystems and economies generate vast amounts of data, which has to be stored and crunched somewhere. This has led to a surge in data center building across the region. According to a report by Data Center Dynamics, China and India saw the highest growth in data center investments in the past year, at 19 and 12 percent respectively.


Connectivity is another important infrastructure component of the digital world, especially with cloud services becoming the preferred paradigm for IT consumption. As more data and workloads converge into mega data centers, fast, reliable and adaptable connectivity will become key. This is where we see new players like Megaport changing the game by ditching the traditional fixed-contract models of telcos and instead offering flexible, demand-based, scalable-capacity provisioning. With plans to expand into the Asia Pacific region this year, Megaport is promising connectivity services of up to 100 Gbps across the region, injecting significant capacity as well as healthy competition into the established telco environment.


But the most critical factor that underpins success in any arena is people. To create the digital equivalent of the Asian Century, the region needs to invest in the skillsets of the future. This can be a real showstopper for a rapidly growing region like Asia: a recent survey revealed that 95 per cent of respondents were concerned the skills shortage has the potential to hamper the effective operation of their business. Thankfully, this is another area where technology companies are investing. Cisco recently announced an initiative to create as many as 400,000 network professionals over the next five years. As a company with its roots in this region, Hitachi understands the importance of partnering with countries to develop the high-end skills needed for the digital economy. In 2011, we established an IT R&D centre in Bangalore, and last year opened another in Singapore, pledging to have a total of 400 researchers by 2015.


The road ahead


Investments in ecosystems that foster innovation, the building of infrastructure and the proliferation of high-end skills are just table stakes. Serious action by governments to tackle privacy and copyright issues is also a crucial factor that will help determine which countries take pole position in Asia’s digital race.


There is now little doubt that the region as a whole will play a prominent role on the global digital stage. The foundations are being set and some amazing things are happening as we speak.


So what do all these developments have to do with business leaders who are busy running their enterprises and trying to find innovative new ways to grow their market share? What does the tussle between countries to be digital pioneers, and the region’s overall aspirations to be a digital powerhouse, have to do with you?


If you think about it, your business is a microcosm of what is happening out there on the international stage. The revolutionary impact of the digital wave, the opportunities that it engenders, and the race to seize these opportunities for competitive advantage are now apparent.


For business leaders, as for countries in the region, the imperative is to nurture innovation, strengthen infrastructure and invest in people in order to capitalize on the opportunities of the information age.


I would be interested to hear other stories from around the region, as well as how you or your company is contributing to Asia's digital ascendancy.