
Let’s face it, if you’re a financial services institution, your main business objectives are to generate revenue, grow your business, and be competitive in the market landscape. However, it would appear that most firms are getting caught up in the ever-increasing cost and resource burdens associated with the dramatic increase in regulatory obligations.

 

With the advent of the fourth industrial revolution and the digital age, where technology heavily influences how we live and operate in our day-to-day lives, new regulatory requirements will inevitably come our way. Because of this, the financial services industry will see a significant change in how it operates today. The fundamental influence and driver behind this change is data.

 

In this digital age, oil and gas are getting old, and gold is looking rather brassy, too! Data is omnipresent and the most precious commodity you can have. Everything in our day-to-day lives revolves around data: collecting it, shaping it, cooking it, dicing it, slicing it, analysing it, monetizing it! It’s all about data, data, data!

Big Data is changing the world as we know it, taking us away from our cozy little comfort zones of what we know into an era of change, transformation, and, in some cases, uncertainty. And dare I be totally cliché and steal the adage from a popular sci-fi show and say that data is taking us “boldly [to] where no man has gone before”? Let me tell you, these changes are remarkable, awe-inspiring, and critical! The term “big data” (which, by the way, I loathe! Oh mighty naming-convention gods, can we not come up with a better word, please?!) is very broad, so let’s break it down for the financial services industry before I go any further. For the sake of this blog and my sanity, I’m just going to call it BD. BD in financial services is both unstructured and structured information, whether it’s sourced through in-house or third-party channels: for example, emails, scanned forms, and messaging (text, voice, video and photo). The data can come from customer transactions, open data, and call centers, just to name a few sources. This data can be a huge advantage for firms because it provides insights on operations, culture, conduct, and client interaction.

 

So, why don’t you, dear reader, humor me, and allow me to take you on a brief journey to show you how I see “BD” impacting society and the changes looming on the horizon. And if you stick with me until the end, I will also look at how financial services institutions can leverage their data to comply with complex global regulations in this new digital age!

 

In the immortal words of Zager and Evans …. “In the year 2525, if man is still alive, if woman can survive, [you] may find….”

 

Improved Healthcare

 

Access to scientific data has revolutionized how the medical profession treats diseases and enables it to pioneer and develop new treatments and medications to better help patients. However, the ever-expanding data universe, and the flow of new, more complex data, makes it nigh on impossible for doctors, scientists, and large medical organizations to sift through, ingest, analyse, or even understand what this data means. So, what’s the solution? We are seeing the emergence of powerful, smart machine learning and other artificial intelligence tools, which use sophisticated technology to hunt, track and look for the proverbial needle in a haystack that would usually be lost in the hubbub of more traditional statistical tools. Large databases with digital imaging tools, for example, can serve as invaluable aids for diagnosing a growing number of conditions. Platforms powered by BD won’t replace the medical profession imminently; however, you will see more medics empowered by more sophisticated technology to better serve their patients’ needs.

 

New Jobs

 

With the advancement of these new technologies powered by BD, I constantly get asked, “What will happen to our jobs?” Look! Let’s not get carried away here. Until we reach the “singularity,” I think I am quite safe in saying that we are still some way off from the day when we will be answering to “Skynet” and “Terminators.” Until that day arrives, the truth is that the need for true intellectual human capital will never go away! You will always need humans to monitor, supervise, and understand what that technology is doing. In saying this, though, how we do our day-to-day jobs will change dramatically. We will see new jobs created for those with expertise in BD and its associated technologies. Look at the insatiable thirst we are seeing on the market right now for data scientists in particular. This trend is only set to grow and quicken in pace, and new roles and jobs will follow suit. “BD requires software developers to implement algorithms developed by data scientists and mathematicians; IT experts are also needed to ensure the increasingly complex computer systems used to run programs are reliable. More high-paying jobs will be available to those who qualify, and students and employees are beginning to change their focus.”

 

City Management

 

How about running a city? It can become an insurmountable nightmare. Complex traffic routes, congestion, public transport inefficiencies, and even something as simple as trying to get an elevator can mean long wait times. I remember working in one office in Canary Wharf on the 34th floor. I had to give myself 10-15 minutes of extra time just to get to meetings!

 

However, with data comes the ability to solve these challenges. For example, my dream of an elevator that detects my needs, waits for me, and safely speeds me straight up to the right floor was actually not some romanticised fantasy, but an actuality that has already been realised by Hitachi. How about another example? Ok, so what about those ghastly public transport delays?!

 

There is always a risk of delays bringing virtually everything to a standstill because of the length and complexity of rail routes, and believe me, I am no stranger to being stuck in a crowded tube for hours with my personal space totally violated and my head stuck up some lovely stranger’s armpit for the duration of the wait. Or having someone rest their newspaper on my head (yes, that has actually happened, and yes, I’m way too short for public transport! But that’s another story for another day). But again, through the use of data we have been able to address this issue by evolving a system that keeps timetables stable for even the most complex networks. The system monitors train operations in real time and delivers up-to-date information to passengers if schedule issues arise, including the fastest route when multiple route options are available, and this is all powered by data.

 

What I’m trying, rather ineloquently, to say is that data will enable city management to sift through information and establish more cost-effective, efficient, and automated ways of running their cities before spending an arm and a leg on costly projects!

 

 

Better Human-Computer Interaction

 

How many of you use automated telephone banking or payment systems?

 

Phone banking system: “Hello, welcome to our automated payment system. Say 1 to check your balance, 2 to pay your bill, 3 to report a card lost or stolen, 4 to learn about our new products, 5 to repeat the menu, 6 to go back to the previous menu, or 7 for anything else.”

Me: [clearly and audibly] 7

Phone banking system: Ok great, 7! Before I can get you to the right department, can you tell me in a few words what you’re calling about?

Me: Discuss my account

Phone banking system: Ok great! Tell me in a few words what you need to discuss.

Me: My statement does not look right

Phone banking system: I’m sorry, I didn’t quite get that! Tell me in a few words what you need to discuss.

Me: My statement does not –

Phone banking system (cutting over me): I’m sorry, I didn’t get that! Tell me in a few words what you need to –

Me (getting irritated): Put me through to an operator

Phone banking system: I’m sorry, I didn’t get that!

Me (nostrils flaring in anger): Put me through to a customer service representative

Phone banking system: I’m sorry, I didn’t get that! Tell me in a few words –

Me (indignant with rage, frustration and ready to pummel my phone): Put me through to a freaking *%$#$ customer %^** Bleep representative NOWWWWWW!!!!!!!!!!!!

 

I think it’s safe to say that 90% of people use these systems to do their day-to-day banking, check bills, balances, etc. And I bet that, like me, you are no stranger to yelling down the phone at some robotic voice enhanced with a soft accent to give it a human element (my AMEX system has a nice-sounding Scottish gentleman). No matter how friendly the robot’s voice, long menu options and inaccurate voice recognition technology can make the experience a nightmare for users, and make it almost impossible for them to conduct their business quickly and efficiently. Well, here is where efficient use of data can help: coupled with the right technology, such as machine learning (ML) or artificial intelligence (AI), providers can deliver faster, more accurate and reliable interactions, and overall a much more satisfying user experience. These types of tools can learn over time, provide swifter responses, and help to reduce time and cost, whilst delivering excellent customer service capabilities. We have seen these voice recognition technologies, such as Siri, Alexa, etc., where human-computer interactions are now becoming more fluid and efficient, in some cases even more efficient than human-run processes (offshore call centres with pre-prepared scripts come to mind; slaps own forehead in despair).

 

Securing Technology

 

As I have been harping on and on about in the previous paragraphs, we are in a world of change, and technology is leading the way whether we like it or not! How many functions do you carry out on your smartphone alone? Internet banking, paying bills, shopping online, ordering food… the list goes on and on! How many of you can actually activate voice commands in your smart car? Or how about your Fitbit or Apple Watch? I remember the day I got my first Apple Watch. As a kid I used to love the Back to the Future movies, and wondered if I would ever see the day when I too could talk to someone through the watch on my wrist like Marty McFly. I can honestly say the day Apple released its first-generation watch was one of the best days of my life, to be able to realize a childish whimsy and see it come true!

 

What I’m trying to say here is that we are now becoming increasingly reliant on technology, and the role technology plays in our lives is very significant. Whilst this all comes with tremendous advantages in terms of convenience and efficiency, the advancement of technology also comes with increased risks. You see, this same sophisticated technology can also be harnessed negatively, and we are seeing a rise in sophisticated cybercrimes, such as data breaches, hacking, and identity fraud. These cybercrimes are very difficult to predict, pre-empt, or detect until it’s too late. However, it’s safe to say that the good guys do have a fighting chance of winning. Data, when combined with AI, can be a powerful tool in the fight against cybercrime. Through analysis, patterns can easily be detected, and the source of malicious attacks can be quickly found and addressed. Vulnerabilities can be spotted and corrected, which allows organizations to better mitigate risks and prevent breaches or attacks.

 


How Financial Services Institutions Can Leverage Their Data to Comply with Complex Global Regulations

 

As a result of the financial services collapse, and the rapid onslaught of ever-changing regulations, the financial services industry has evolved dramatically. In terms of operations and service delivery alone, we can see great improvements within organizations. However, what we have seen is that most organizations fail to leverage the information within their own databases. Heavy reliance on behemoth, brittle legacy systems, data living in silos, a lack of accountability and robust data governance, and not knowing where data lies are all key contributing factors. Even well-intentioned creations of so-called “data lakes” have turned into fly-infested, stinky “data swamps,” where you can’t tell the muck from the mire!

 

However, my dear reader, I can safely say that this is all about to change dramatically within the financial services industry. The banking sector alone creates and collects immense volumes of data. Some industry forecasts predict exponential growth in the volume of data before 2020. “The big D” is a huge step towards the development of the financial services industry, and will propel it out of the dark ages into the new millennium! Let’s look at how BD will improve and enable the FS industry:

 

The Pros of the Big D for FS

  1. Detailed Progress Evaluation
  2. Service Delivery Improvements
  3. Fraud Detection & Prevention
  4. Enhanced Compliance Reporting
  5. Customer Segmentation
  6. Personalized Product Offerings
  7. Risk Management

 

How Hitachi Vantara Can Help

With data volumes rising, many of the largest financial institutions are looking to Hitachi Vantara for actionable insights into customers, operations, and risks. Learn more about Hitachi Vantara’s banking and financial services solutions here.

 

Further Reading:

https://hitachivantara.com/en-us/pdf/analyst-content/strategies-for-weathering-innovation-storm-in-communications-governance-e-discovery.pdf

 

http://social-innovation.hitachi/us/case_studies/elevator_china/index.html

 

 

This week, Hitachi Vantara is joining long-time strategic partner Cisco at its annual Partner Summit in Las Vegas, #CiscoPS18. The two companies have worked together for decades, continuously producing innovations that are making data center modernization achievable for a wide range of businesses.

 

Market-leading networking solutions from Cisco, combined with best-in-class data storage and data center management technologies from Hitachi, have been steadily revolutionizing the way companies store, access and share data.

 

The goal of the Hitachi-Cisco collaboration is straightforward: As data centers become ever more complex, the two companies partner to make it easier, more efficient and more cost-effective to minimize this complexity. By delivering innovative solutions that reduce the total cost of data center and network ownership while supporting strategic performance gains, the companies have collectively helped thousands of organizations maximize the return on their IT investments and achieve their business objectives.

 

Creating Outcomes That Matter

There’s no doubt that data plays an essential role in today’s highly digital business environment. Every day, companies gather insights about their customers, trading partners, suppliers, internal and external processes, market trends and other key strategic topics. This data represents an organization’s most important asset. Yet it can be hard to manage, share, protect, analyze and capitalize on these insights.

 

Hitachi and Cisco have responded to this challenge by increasing the ability of every business to access and leverage world-class data management and networking solutions. In everything they do, these two companies have focused on creating practical, achievable benefits — and outcomes that genuinely matter from a strategic perspective.

 

Speed Plus Intelligence

For example, recent innovations from Hitachi and Cisco support new levels of automation and agility for data centers of all sizes. By combining artificial intelligence and machine learning capabilities from Hitachi with the scale and speed of Cisco’s networking solutions, data center operations can now be synchronized and automated at a game-changing level.

 

By leveraging real-time analytics and rapid root-cause identification, companies can enjoy uninterrupted data availability, increased application performance and highly efficient daily operations. And, by increasing predictability, they can also significantly reduce their exposure to the risk of downtime and data losses.

 

Not only has this joint effort brought enhanced capabilities and best practices to a new set of companies, both large and small — but this collaboration has also delivered innovations to new markets around the world. The two companies have worked together to serve more than 10,000 mutual customers globally.

 

“The pace of this highly-competitive marketplace requires efficient operations, the ability to access and analyze data insights, and business agility and responsiveness. To achieve this, organizations want to work with experts, partners who have the proven expertise and solutions to deliver their desired business outcomes, Hitachi Vantara and Cisco can be those partners,” said Mike Walkey, Senior Vice President, Strategic Partners & Alliances at Hitachi. “Leveraging our joint experience, our expanded collaboration with Cisco will continue to usher in a new era of data center modernization for organizations regardless of industry, geography, or size. We’re excited about what the future will bring.”

 

“It’s rare to see a partnership thrive across decades,” added Mike Austin, Senior Director, Global Industry Partners at Cisco. “I believe our relationship has proven successful because both companies believe data center modernization can be leveraged as a competitive advantage. By working smarter, simpler and faster, companies of all sizes and types can thrive in today’s complex global business landscape.”

 

What’s Next? Stay Tuned

Hitachi and Cisco have already partnered to produce innovations in data center modernization, data governance and data-driven insights. These joint developments have proven critical in ushering in today’s era of digital transformation — and increasing the speed and intelligence of everyday IT operations.

 

What’s on the horizon for these two strategic partners? Stay tuned for new innovations in 2019 and beyond as Hitachi and Cisco look to accelerate their strategic engagement. While the past results of this collaboration have been impressive, there is still much more to come. Watch this space.

Traditionally, Fibre Channel networks have been considered more secure than Ethernet networks, owing to the fact that it is hard to snoop traffic on a Fibre Channel network: one needs a physical tap to analyze the traffic.

 

Basic security in Fibre Channel networks starts with zoning. Essentially, every server that has access to data on an enterprise storage array is populated with two or more Fibre Channel HBAs. These HBAs send and receive data from the arrays and are called initiators in industry parlance. Each array appears as a target for the initiators, and virtual disk drives called LUNs are made available to the initiators via these targets.

 

Zoning is a basic mechanism whereby one manually specifies which initiator can communicate with which target port. Each enterprise array has multiple target ports to which LUNs are published.

 

When an initiator and a target are connected to a Fibre Channel fabric switch, they perform a fabric login called a FLOGI. On successful login, they register their addresses, called WWNs, with the name server running on the Fibre Channel switch. Each WWN is unique to an initiator or target and is assigned by the manufacturer, much like a MAC address is assigned to a network card.

 

As part of the deployment process, a storage administrator configures what is known as zoning, whereby he or she creates zones, typically two-member initiator-target pairs, which may overlap. Once zoning is configured, an initiator cannot communicate with targets, or the LUNs behind them, outside its zone.

 

Now consider a typical server in the datacenter hosting up to 100 virtual disk drives (LUNs). 50 of these LUNs may be dedicated to an Oracle database server and the remaining 50 may be exported via NFS to clients. Consider a scenario where trojan ransomware infiltrates the NFS share and thus gains access to the other 50 LUNs where the Oracle data is stored. It could potentially take over the Oracle database files and lock out the database process by encrypting the files.

 

Now consider a traditional scenario where the zoning mechanism plays out as below.

 

  1. Initiators and targets log in to the Fibre Channel name server via a Fabric Login (FLOGI)
  2. Zoning is configured, allowing specific initiators to talk to specific targets
  3. On zoning, a state change notification is triggered, whereby an initiator that is in the same zone as the target logs in to the target via a Port Login (PLOGI)
  4. Next, a Process Login (PRLI) is initiated, whereby communication at the FC-4 layer (typically SCSI) can occur between the initiator-target pair; a minimal sketch of this allow-check follows the list
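To make the sequence above concrete, here is a minimal sketch in Python (the WWN values are hypothetical, purely for illustration): a zone is modelled as an initiator-target pair, and a login attempt outside any configured zone is refused. This is only an illustration of the allow-check the fabric effectively enforces at steps 3 and 4, not any vendor's implementation.

    # Hypothetical WWNs, for illustration only.
    ZONES = {
        ("10:00:00:90:fa:11:22:33", "50:06:0e:80:aa:bb:cc:01"),  # (initiator, target)
        ("10:00:00:90:fa:44:55:66", "50:06:0e:80:aa:bb:cc:02"),
    }

    def plogi_allowed(initiator_wwn: str, target_wwn: str) -> bool:
        """Return True only if the initiator and target share a configured zone."""
        return (initiator_wwn, target_wwn) in ZONES

    # An initiator never reaches the PRLI stage with a target outside its zone.
    assert plogi_allowed("10:00:00:90:fa:11:22:33", "50:06:0e:80:aa:bb:cc:01")
    assert not plogi_allowed("10:00:00:90:fa:11:22:33", "50:06:0e:80:aa:bb:cc:02")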

 

Now, what if an application-layer login to the name server were introduced after the PLOGI and prior to the PRLI? Each application could be assigned a GUID, and an administrator could specify which application can talk to which LUNs published on the target.

 

One could take this further, with the name server on the switch acting as a key exchange server and forcing the application and the target to authenticate using a symmetric or asymmetric key exchange.
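As an illustration of the symmetric variant, here is a rough challenge-response sketch in Python. The idea of the name server issuing a nonce and distributing a pre-shared key is an assumption made for illustration, not an existing Fibre Channel mechanism.

    import hashlib
    import hmac
    import os

    # Hypothetical pre-shared key handed to the application and the target by
    # the name server acting as a key server (an assumption, not an FC standard).
    SHARED_KEY = os.urandom(32)

    def issue_challenge() -> bytes:
        """The name server issues a random nonce to the application."""
        return os.urandom(16)

    def respond(nonce: bytes, key: bytes) -> bytes:
        """The application proves possession of the key without revealing it."""
        return hmac.new(key, nonce, hashlib.sha256).digest()

    def verify(nonce: bytes, response: bytes, key: bytes) -> bool:
        """The target recomputes the MAC and compares in constant time."""
        expected = hmac.new(key, nonce, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    nonce = issue_challenge()
    assert verify(nonce, respond(nonce, SHARED_KEY), SHARED_KEY)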

 

The only weak link here is the case where a rogue process manages to spoof the GUIDs of authentic processes. This can potentially be worked around by making the GUID partly dependent on a binary hash of the process executable.
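One way to realise that idea, as a rough sketch in Python: derive the application GUID deterministically from a hash of the executable on disk, so a process whose binary differs from the registered one cannot reproduce the genuine application's GUID. The namespace choice and the example path are assumptions for illustration.

    import hashlib
    import uuid

    def binary_hash(executable_path: str) -> str:
        """SHA-256 of the process executable's bytes."""
        digest = hashlib.sha256()
        with open(executable_path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def application_guid(app_name: str, executable_path: str) -> uuid.UUID:
        """GUID partly dependent on the binary hash, so a spoofing process with a
        different executable cannot reproduce it."""
        return uuid.uuid5(uuid.NAMESPACE_URL, app_name + ":" + binary_hash(executable_path))

    # Example with a hypothetical path:
    # print(application_guid("oracle-db", "/u01/app/oracle/bin/oracle"))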

A client recently asked me if I could develop a report which analyzes the primary and secondary areas of their business. I told her that if she could elaborate on the meanings of "primary" and "secondary" markets, I would be glad to help. She and her analysis team went away and discussed it; after a few days the team still could not come to a consensus. Simply put, one person can define what constitutes primary and secondary markets one way, and another person in another way.

 

I understood the client’s vision and the difficulty that she was facing in articulating the exact specifications of primary and secondary markets. But the lack of definitions gave me an opportunity to be creative. And I had to consider that clients typically prefer intuitive tools over the most sophisticated tools that they do not understand.

 

Imagine a situation like this: a task that involves two participants. The first participant is a regular person and the second is a car technician. On a table there are two sets of the following, one set for each person: 100 nails, a wooden plank, a regular hammer, and a sophisticated powered hammer from the shop. Each participant is given 5 minutes to drive as many nails as they can into the wooden plank.

 

Given that the regular person has no experience with the power tool, compared to a seasoned car technician who has ample experience, which tool does each participant use for the task? Due to the time constraint, the regular person would probably go with the basic hammer rather than trying to learn and use the sophisticated tool from the shop. Even though the powered hammer is the ideal tool for the task, any untrained person would choose a tool that he or she is more comfortable with, especially in an environment with resource constraints.

 

In this spirit, while developing the report, I decided to introduce percentiles as a simple statistical tool that captures the client’s intent.

 

A percentile is a familiar statistical measure that is often seen in, for example, ACT and SAT scores. The Boston University School of Public Health defines a percentile as “a value in the distribution that holds a specified percentage of the population below it.” If a student receives a test score in the 95th percentile, her score is above 95 percent of the population, or within the top 5%. Simple enough.

 

By integrating percentiles into a map, the combination can capture the areas with high business activity. Activity can be abstracted to any measurable volume. For the purposes of this article, the linked workbook applies the technique, combining percentiles and maps using public data from the City of Chicago (Figure 1).

 

Fig. 1 - Food Inspection Density in Chicago and surrounding areas

 

The city’s food inspection data includes information on the status, types and locations of inspections conducted by officials. The heatmap represents the volume of inspections. Percentiles are calculated over the distribution of inspections, and zip codes are used to aggregate the individual locations on the map. With this, you can see that zip codes with more inspections are shown in darker shades of red, while areas with sparse volumes are colored in shades of green.
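A minimal sketch of the underlying calculation, in Python with made-up inspection counts (the actual workbook uses the City of Chicago data): aggregate inspections by zip code, take the 99th percentile of that distribution as the "Top 1%" cut-off, and keep only the zip codes at or above it.

    import numpy as np

    # Hypothetical inspection counts per zip code, for illustration only.
    inspections_by_zip = {
        "60614": 5200, "60647": 5100, "60601": 2300,
        "60640": 1900, "60201": 700,  "60202": 650,
    }

    counts = np.array(list(inspections_by_zip.values()))
    cutoff = np.percentile(counts, 99)   # the "Top 1%" threshold

    top_1_percent = [z for z, c in inspections_by_zip.items() if c >= cutoff]
    print(top_1_percent)                 # -> ['60614'] with these made-up numbers

Filtering the records first, for example on a 'Failed' result, and then recomputing the cut-off is what produces views like Figure 3.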

 

Fig. 2 - Areas of top 1% inspections in Chicago
Fig. 3 - Areas where top 1% of inspections did not pass

 

Looking at Figure 2, by selecting the Top 1% (the 99th percentile) of the number of inspections, we can see that zip codes 60614 and 60647 have the highest number of inspections compared to other areas. We can take this a step further and select the ‘Failed’ value under the Result dropdown (Figure 3). From this, we see that 60614 has the most food safety violations in the metropolitan area and represents the top 1% of the distribution.

 

Fig. 4 - Inspections in Evanston
Fig. 5 - Areas of top 1% inspections in Evanston

 

Now, instead of looking at the bigger market, let’s shift our focus to a suburb of Chicago and see how the technique responds to the change. In the City dropdown, I selected Evanston, as seen in Figure 4. As expected, the market in Evanston is much smaller, with only two zip codes compared to the Chicago metropolitan area. In statistical terms, the population distribution (here, the distribution of inspections) shrank in size.

 

The change in the market and population distribution does not affect the core concept and functionality of percentiles. When the Top 1% is selected, as seen in Figure 5, only one of the two zip codes remains: 60201, with 7 inspections. Note that zip code 60614, which we saw earlier with all values checked under the City dropdown, no longer shows in the visualization, because 60614 is not part of Evanston.

 

The technique is able to retrieve relevant information before calculating the percentile distribution. Even though the size of the markets changed between the City of Chicago and suburban Evanston, the technique adapts to the change in the population distribution while retaining the concept of the top and bottom markets. The top 1% of food code violations in a large metropolitan city like Chicago as well as the top 1% of violations in smaller areas like Evanston are both alarming. The top 1% of Chicago is of more concern because the market is much bigger. But regardless of the size of the regions, the top 1% represents the primary area of concern.

 

The percentile technique provides quantitative descriptions of qualitative business concepts. At the beginning of the article, I shared a story where the client and her team struggled to articulate the meanings of primary and secondary markets, especially when the scope of the business changes depending on which subsets of their business are selected (the Chicago metropolitan area versus Evanston). By using a report that integrates the percentile technique, one can spell out what a person means by primary and secondary markets.

 

For example, a marketing manager proposes that for the upcoming quarter, her team plans to expand the business’s primary market by introducing a new internet campaign. If an executive asks her what she means by "primary market", she can point to a specific percentile figure, such as the area where the top quarter (above the 75th percentile) of sales originated. The technique enables users to communicate precisely when formulating business strategies.

 

The technique illustrated is an example of a template model which can be adapted to provide insights into a wide range of business problems. In the food inspections report, we saw that the report adapts to a change of environment from a large city to a suburb when retrieving the relevant top areas of inspections. The technique can be applied to capture where a local office is attracting its revenue or where the highest-achieving students are coming from. The technique does not have to depend on a geographical map. We can abstract the notion of a map into its elemental structure, a mathematical space. Without going into technical details, this technique requires two things: a space, like a geographical map, and a population distribution over which percentiles can be calculated.

 

One example that does not use geographical space is mapping the medical profiles of patients who have cancer. The former notion of the map can be replaced with the medical profile of each person. Specific areas, the zip codes from the previous example, can be replaced with features within the medical profiles, such as age groups, eating habits, genetic attributes, symptoms and previous diagnoses. The population distribution comprises the volumes of each feature across the medical profiles. Instead of comparing zip codes as in the food inspection example, this case explores which features contribute the most to breast cancer. For example, the model may show that the genetic defect BRCA1 is the top 1% (or 99th percentile) feature across the medical profiles of many breast cancer patients.
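As a rough sketch of this abstraction, in Python with fabricated feature counts (purely illustrative, not real clinical data): the "space" becomes the set of profile features, the population distribution becomes how often each feature occurs, and the same percentile cut-off flags the leading features.

    import numpy as np

    # Hypothetical counts of how many patient profiles contain each feature.
    feature_counts = {
        "BRCA1 mutation": 940, "age 50+": 870, "family history": 610,
        "smoker": 300, "high-fat diet": 280, "prior benign tumor": 150,
    }

    counts = np.array(list(feature_counts.values()))
    cutoff = np.percentile(counts, 99)   # same 99th-percentile cut-off as before

    leading_features = [f for f, c in feature_counts.items() if c >= cutoff]
    print(leading_features)              # -> ['BRCA1 mutation'] with these made-up numbers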

 

The technique is a model and not a solution to business problems. Just because the model says that a genetic mutation in BRCA1 is within the top 1% of features (i.e., most common among patients) for breast cancer does not necessarily mean that the mutation causes cancer. Milton Friedman, a prominent figure in economics, explains the scope of models in his essay "Positive Economics":

“abstract model and its ideal types but also of a set of rules…. The ideal types are not intended to be descriptive; they are designed to isolate the features that are crucial for a particular problem.” - Milton Friedman in "Positive Economics"

The strength of this technique is its ability to highlight the important elements of the business problem at hand: it could be the area where a large portion of revenue is generated, the area with the most food code violations, or a leading feature that affects cancer. The technique highlights specific features by simplifying the business problem and the environment with assumptions and rules. In the food inspection report, a "bad neighborhood" for food code violations is determined only by the volume of inspections. Further analysis is required to conclude definitively that the identified area is indeed plagued with kitchens infested with roaches. The model is effective in its ability to deliver a clear answer given the simplified environment. And the simplicity of the model allows both technical and business users to understand its scope.

 

This article is the first part of a series in which I will unveil techniques that I have found useful in consulting in the data industry. While I find sophisticated techniques attractive, I believe choosing a parsimonious technique as a consultant is more prudent than choosing a methodology that clients cannot easily use in their decision making. Future parts of the series will continue to present specific examples and techniques aligned with the spirit of this story.

 

External Links:

Food Inspection Density Map of Chicago

Food Inspection Data

Posted by Nirvana Farhadi  Nov 20, 2017

 

 

Today heralds a historic day for us at Hitachi Vantara! We are extremely honored to be hosting and strategically collaborating with the FCA (Financial Conduct Authority), the BoE (Bank of England), Grant Thornton, and other key financial services stakeholders in holding a two-week TechSprint to explore the potential for model-driven, machine-readable regulation.

 

This incredible two-week TechSprint will explore how technology can provide solutions to the challenges firms face in implementing their regulatory reporting obligations. If successful, this opens up the possibility of a model-driven, machine-readable and machine-executable regulatory environment that could fundamentally change how the financial services industry understands, interprets and then reports regulatory information!

For more information on the event, please go to:

https://www.fca.org.uk/firms/our-work-programme/model-driven-machine-executable-regulatory-reporting

 

A big thank you to all participants and strategic collaborators who have joined us today on the opening of this event! Participants are listed below.

 

  • Financial Conduct Authority
  • Bank of England
  • Grant Thornton
  • Hitachi
  • HSBC
  • Credit Suisse
  • Santander
  • JWG
  • Linklaters
  • University College Cork
  • The Information Society Project at Yale Law School
  • Stanford University
  • Governor Software
  • Immuta
  • Lombard Risk
  • Model Drivers
  • Regnosys
  • Willis Towers Watson

#FCAsprint #HitachiVantara #BoE #GrantThornton #JWG #modeldrivers #HSBC #CreditSuisse #santander

 

For any information, and before communicating about this event, please ensure you contact:

Niv@hitachivantara.com


Containers have been fueling full-stack development in webscale companies for a number of years now. Financial services, being risk averse, are typically slow to jump on new trends, but have started adopting containers at a very fast rate now that they are going mainstream. If you’re a financial institution and you aren’t at least taking a serious look at microservices and containers… you might be behind already.

Case in point: FINRA (a non-profit authorized by Congress) oversees more than 3,700 securities firms. It has to go through more than 37 billion transactions per day, with over 900 cases of insider trading a year and more than 1,500 firms having disciplinary actions levied against them (http://www.finra.org/newsroom/statistics).

 

When you look at the numbers year over year, you start to get the picture that fines, fraud and insider trading are growing at as rapid a pace as data and technology change. The number of data sources you need to traverse in a small amount of time is huge. Going through that many transactions a day (with around 2.5 insider trading cases arising each day) means that queries over that much data can take hours, unless you factor in containers and the ability to blend data sources quickly and return results fast. It’s like a data-driven arms race.

 


 

This is where containers can help and are already driving the financial services industry across regulatory, operational, and analytical areas. Here are a few areas where I think containers are most impactful:

  • Bare Metal – Customers are increasingly looking for bare-metal options to quickly spin containers and microservices up and down. This helps in two ways: first, they reduce licensing fees for hypervisors, and second, they can utilize the hardware faster. This buys them economies of scale and a good ROI, with the software-defined data center (SDDC) and software-defined networking (SDN) being two large drivers of this trend.
  • Automation – I’m a huge fan of automation, and when it comes to digital platforms that need little to no human interaction, banking and finance are no strangers to this. People are prone to error, whereas automation is only as fallible as its programming. Traditionally a lot of analysts have been tied to these banking and finance queries, having to parse through large amounts of data. One example of automation is the fact that you no longer need to go to your bank branch and interface with a teller. Personal connections and customer interaction are quickly being replaced with the ability to open your mobile phone and transfer money anywhere you want, pay that bill, or send money to your friends, all with the click of a button. I can tell you what I spent, where I spent it, and what category it falls in within seconds. All of this without ever talking to a teller or needing some fancy analyst. Automation is the answer, and it’s no different with containers.
  • New Regulations – Governments always want to know where, who, and how that money moves. Compliance and fraud are at an all-time high. Look no further than Bangladesh Bank, where over 80 million dollars was stolen by hackers, to realize this is a serious concern that could have been worse (https://www.reuters.com/article/us-usa-fed-bangladesh/bangladesh-bank-exposed-to-hackers-by-cheap-switches-no-firewall-police-idUSKCN0XI1UO). A misspelling of “Shalika Foundation” as “Shalika Fandation” saved Bangladesh Bank several hundred million dollars more; potentially over 1 billion dollars could have been stolen. In this case the hackers’ lack of automation helped, but far worse are the cybersecurity risks involved for the bank. They can’t afford to miss any transactions happening anywhere they operate.
  • Cybersecurity – Financial security, as noted above, is a big real-time, data-driven operation that requires tactics and tools that are responsive and can scale. This is again where container environments thrive. They can help identify and prevent things such as money laundering and intrusions like the one above, without counting on hackers misspelling something to take the human element out of the picture. Cybersecurity threats are on the rise, and it takes nothing more than not keeping up with the latest security patches to have a big impact once attackers get into your environment. Target, Visa, Sony, Equifax (and their customers) have all learned what can happen with a breach.
  • Scale of transactions – As with the FINRA example above, as we get increased access to our money, with more ability to move that money quickly, financial institutions need to keep up. With data growing 10x, and unstructured data growing 100x, the need to parse through the transactions quickly is becoming ever more challenging. Containers and scale-out microservices architectures are the keys to solving this puzzle.

 

I can remember as a kid I had a paper register with how much money I had in it, and once a month or so I could take my allowance to Fifth-Third Bank and they would write my new total, and deposit my money. My mom would also keep her checkbook up to date, and it would have every transaction she ever did, from ATM to checks, religiously kept in it. I can’t tell you the last time I was in a bank, let alone kept a register log. They still send me one, but I think it’s still in the box somewhere with those checks I don’t use often unless forced to. Financial institutions now need to have all my transactions and have them accessible quickly. They need to watch for fraudulent transactions, where I am, how much I’m taking out a day, and what my normal spending pattern looks like to stop identity theft. Tough to do without heavy analytics in real time, even tougher without containers.

 

So what are the limitations of current systems?  Why not just keep doing what we’ve been doing?

 

There’s the old adage about doing things the way you always have and expecting a different result. VMs are like my old register: well suited to those old monolithic applications. Not that there is anything wrong with the way I used to go to the teller to make transactions; it’s just clunky, slow and expensive. VMs are the equivalent of the teller. They aren’t responsive and they can’t meet the scale of modern distributed systems. Scale-up was the answer in the past (more CPU, more memory, more overhead, expensive licenses and maintenance), but go-big-or-go-home doesn’t work in today’s world. These dedicated clusters might work hard sometimes, but more often than not you’re scaling a large system up for those “key” times when you need it. With a highly scalable architecture, you’re able to scale up and down quickly based on your needs, without overbuying hardware that sits idle. I won’t even touch on the benefits of cloud bursting and being able to quickly scale into the cloud environment.

 

Secondly, integration for traditional architectures is difficult, as you have to worry about multiple applications, integration environments, drivers, hypervisors, and golden images just to get up and running. How and where the data moved was secondary to just getting all the parts and pieces put together. Scale-out, composable container architectures that were designed to come together to address specific problems like data ingestion, processing, reactive workloads and networking (e.g., Kafka, Spark, Cassandra, Flink) solve the issues of complex integration. These architectures are centered around scaling, tackling large data problems, and integrating with each other.
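As a small illustration of that composability, here is a hedged sketch of a containerised micro-service using the kafka-python client; the topic name, broker address and threshold are assumptions, not a reference architecture. Each replica of the container scales out simply by joining the same consumer group.

    import json
    from kafka import KafkaConsumer  # kafka-python client

    # Hypothetical topic and broker; in a container these would come from configuration.
    consumer = KafkaConsumer(
        "transactions",
        bootstrap_servers="kafka:9092",
        group_id="fraud-screening",          # more replicas means more parallelism
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    SUSPICIOUS_AMOUNT = 10_000  # illustrative threshold, not a real compliance rule

    for message in consumer:
        txn = message.value
        if txn.get("amount", 0) > SUSPICIOUS_AMOUNT:
            # A real pipeline would publish to an alerts topic or a case-management system.
            print(f"flagging transaction {txn.get('id')} for review")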

 

So, to answer the question of whether financial services are ready for containers: undoubtedly yes. I would almost say they can’t survive without them. Today’s dated VM systems aren’t ready to tackle the current problems, and they certainly don’t scale as well. In my next blog, I’ll go through some stacks and architectures that show how you can get significant results specifically for financial services.

 

Casey O'Mara

European banks will need to have Open Banking APIs in place by January 2018.

This whiteboard video explains how to enable your API platform and keep existing systems safe.

 


Open banking APIs have become a financial services industry hot topic, thanks to two regulatory decisions.

The first is in the UK, where the Competition and Markets Authority (CMA) started investigating competition in retail banking. They produced a report last year which proposed several recommendations and requirements.

A principal recommendation was to improve competition in retail banking. To achieve this, the CMA decided traditional banks should expose their customer data to third parties that would deliver additional retail banking services.

In parallel with the CMA, the European Commission started its second review of the Payment Services Directive. That review also proposed that banks, with customer consent, should expose customer data to third parties, who could then potentially deliver superior services.

 

Four challenges of implementation

 

From talking to our existing banking customers, we have identified four challenges of introducing an open banking API.

The first is being compliant in time. These are requirements from the CMA and a directive from the European Commission. The APIs need to be in place at the start of 2018, which leaves banks little time at this point.

Second is improving customer experience. Retail banks across Europe are increasingly focused on delivering new and improved customer experiences.

Third is competition. The principal aim of introducing open banking APIs is to allow other service providers to utilise the data, then offer new and improved services to retail banking customers.

Finally, one that doesn’t come up very often but we think is important: the operational risk that building and exposing APIs places on traditional systems.

 

Typical existing core systems

 

No bank started life as it is today. The majority have built up core systems over many years through mergers and acquisitions. Furthermore, they’ve delivered lots of different services over those years too.

Those systems as a result have become interlinked, inter-joined, and incredibly complex. They are traditional architectures and they scale up.

What I mean by scale up is that if they run out of capacity to deliver new services, the fix is to install and implement a bigger system, or a bigger storage device. Scale-up systems are capital intensive and take time to become productive.

We should consider how existing systems are managed and changed. Due to the complexity, banks must make sure that those systems are reliable and secure. To achieve this, they wrap rigorous change control and management processes around the systems.  As a result, any major change, which exposing these APIs certainly is, equates to a substantial risk.

There is one other aspect that’s worth considering too. Banks know how many transactions existing core systems need to process. By opening this API, that becomes unpredictable. The volume and shape of the transactions that those APIs will generate is difficult to predict.

 

Database extension alternative

 

Instead of using existing core systems, our view is that most banks will build a database extension or caching layer. In this alternative, when a customer consents to the bank exposing their data to third parties, the bank extracts that data from its existing core systems, transforms it for the new-style database, and then populates the database extension with the data.

This alternative provides several benefits. First, banks can quickly become compliant and provide open banking APIs. This solution will scale out, so as banks add more customers to this process, they can scale easily.

More importantly, expect forward-thinking banks to use the API to add new services. Potentially they will start to incorporate lots of different data sources: not only traditional data, but geospatial data, weather data and social media data too.

This would enable banks to deliver a rich set of services to their existing customers through the open banking API and potentially monetise them.
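To make the caching-layer pattern concrete, here is a minimal sketch in Python with Flask; the endpoint path, field names and in-memory "cache" are illustrative assumptions rather than any bank's actual API or the Open Banking specification. Requests are served entirely from the extension layer, so they never touch the core systems directly.

    from flask import Flask, abort, jsonify

    app = Flask(__name__)

    # Stand-in for the database extension / caching layer populated by the
    # extract-transform process described above (hypothetical data).
    ACCOUNT_CACHE = {
        "acc-123": {"balance": 2450.75, "currency": "GBP", "consented": True},
        "acc-456": {"balance": 310.00, "currency": "GBP", "consented": False},
    }

    @app.route("/open-banking/accounts/<account_id>", methods=["GET"])
    def get_account(account_id):
        account = ACCOUNT_CACHE.get(account_id)
        if account is None or not account["consented"]:
            abort(404)  # unknown account, or the customer has not consented to sharing
        # Only the cached extension layer is queried; core systems are untouched.
        return jsonify({"id": account_id,
                        "balance": account["balance"],
                        "currency": account["currency"]})

    if __name__ == "__main__":
        app.run(port=8080)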

 

Moving data from existing systems to the new database

 

Most banks will have several tools which extract data out of systems and populate business information systems and data warehouses.

Extracting data from traditional systems, transforming it and blending it so that you can use it in these new, agile scale-out systems, however, requires something different. A lot of older tools, which have been very good at extracting data, aren’t effective at these new-style transformation processes.

One tool which is effective at this is Pentaho, which specialises in transforming data from traditional sources and then blending different data sources so that banks can offer a richer set of services.

 

Monetizing the API layer

 

Regardless of the approach a bank takes, it will need to support open banking APIs from the start of next year. This leaves little time to become compliant, and because compliance is just a cost to them right now, we believe that the more forward-thinking banks will quickly want to extend the capability of those open banking APIs to develop new revenue streams and monetise them.

We at Hitachi think this is an exciting time, not only for fintech start-ups but traditional banks too, who through these directives, have been given an opportunity to deliver something new to their customers.

 

If you would like to learn more about how Hitachi can help you introduce Open Banking APIs, get in touch via LinkedIn or learn more about our financial services offering here.

As a brand and as a company, Hitachi is known for building high-quality solutions – from proton beam therapy and high-speed, mass transit bullet trains to water-treatment offerings and artificial intelligence (AI) technology – offerings that make the world a better place to live. For this reason, we hold ourselves to the highest of standards and we sweat the details. We know that, in many of these cases of social innovation, failure on our part could have dire or disastrous consequences.

 

Of course, we can’t make the world a better place alone. We need partners who will sweat the details. Partners like Intel, who, with their introduction of the latest Intel Xeon family of processors or their work on computational performance of a deep learning framework, demonstrate their intense focus on innovation, quality and performance.

 

As we continue to examine the Intel Xeon family of processors, we see unlimited potential to accelerate and drive our own innovations. New capabilities can help us achieve greater application performance and improved efficiency with our converged and hyperconverged Hitachi Unified Compute Platform (UCP). And, as Intel pushes the envelope even further with next-generation field-programmable gate array (FPGA) technologies as well, we estimate that users could see upwards of a 10x performance boost and a significant reduction in IT footprint.

 

In important vertical markets like financial services, we have seen tremendous success around ultra-dense processing with previous generation Intel processing technologies. There, we are able to capture packets at microsecond granularity, filter in only financial services data, and report on both packet performance plus the financial instruments embedded in the packets. We can’t wait to see how Intel’s latest processing advancements help us exceed expectations and today’s state of the art.

 

We look forward to the road ahead. In the meantime, we’ll keep sweating the details and working with partners like Intel who do the same.

 

The regulatory burdens placed on financial services organisations have reached unprecedented levels.  From data security and access with GDPR to investor protection and the various themes in MiFID II/MiFIR, businesses are besieged by new regulations on an almost monthly basis.

 

According to Business Insider, from the 2008 financial crisis through 2015, the annual volume of regulatory publications, changes and announcements has increased by a staggering 492%. It is an issue I have addressed not only at the numerous events I have spoken at and attended since joining Hitachi, but throughout my career.

 

Understandably, organisations are looking for ways to ease this regulatory burden by automating onerous processes, and for ways to make the Risk, Compliance, Operations and Audit (ROCA) lines of business more cost-effective and efficient, and to take away the resource burdens that these organisations currently face.

 

After all, these organisations are not in the business of ROCA; they are in the business of generating revenue, which these functions clearly don’t generate. The need to ease this burden has seen the rapid rise of RegTech, or regulatory technology.

 

The idea behind RegTech is that it harnesses the power of technology to ease regulatory pressures. As FinTech innovates, RegTech will be needed to ensure that the right checks and balances are quickly put in place so that organisations do not fall short on their regulatory obligations.

 

RegTech is not just about financial services technology or regulations; it is broader than that and can be utilized in numerous industries such as HR, oil & gas, and pharmaceuticals. With RegTech, the approach is to understand the “problem” (be it operational, risk, compliance or audit related), see which regulations bear on that problem, and solve it using technology.

 

RegTech is a valuable partner to FinTech. Although some refer to it as a subset of FinTech, in my view RegTech goes hand-in-hand with FinTech: it should work in conjunction with financial technology innovation.

 

RegTech focuses on technologies that facilitate the delivery of regulatory requirements more efficiently and effectively than existing capabilities. RegTech helps to provide process automation, reduce ROCA costs, decrease resource burdens and create efficiency.

 

FinTech, by its nature, is disruptive. It aims to give organisations a competitive edge in the market. When FinTech first took off, one of its main disruptions was the creation of algorithmic and high-frequency trading systems operating at lightning speed.

 

As these FinTech innovations have become faster, more in depth and more intricate, regulators across the globe have sought to establish some boundaries to prevent fraud, protect consumers and standardise the capabilities of this technology. 

 

The accelerated pace at which FinTech has been adopted, and is constantly innovating, means the regulators have struggled to keep up. Now, however, far-reaching and broader regulations are being established regularly, hence the requirement for RegTech to help manage this plethora of rules and procedures. RegTech is particularly relevant within the ROCA arena, where oversight of the regulations sits squarely within the remit.

The financial services industry is heavily regulated, through myriad interlinking global regulations. These regulations are implemented through reports, whether trade, transaction, position or periodic reporting, or some sort of disclosure. Reports are the lifeblood of regulation and are based on data; therefore data is a crucial part of compliance.

 

At the core of most regulations is the need for financial services organisations to locate, protect and report on the data and information held within their systems.  The regulations require not just audit trails, but each report must demonstrate exactly how data is handled both internally and externally. 

 

Reporting and regulation are unavoidable for all financial services organisations. FinTech, which is still developing and not regulated yet, will be caught up with very quickly, as the regulators quicken their pace in keeping up to date with innovation and possible disruptions.

 

The challenge is collating and curating this level of information from the existing systems within the banks, within the deadlines specified by the regulations. This is why RegTech exists and plays such a key role.

 

At a very fundamental level, RegTech helps financial services organisations to automate many of the manual processes, especially those within legacy systems, whether that be reporting, locating customer data, transactional information or systems intelligence. 

 

The crucial element here is not only the legacy and aging systems still held within many financial institutions - where data is stored in everything from warehouses to virtual arrays, and therefore locating and retrieving information from such becomes a huge challenge - but the legacy thinking of leadership in organisations is also problematic.

 

Many of these organisations are led by individuals whose only thought is the next six months. As Warren Buffett, however, stated: “Someone is sitting in the shade today because someone planted a tree a long time ago.” Leadership needs to think strategically.

 

The recent WannaCry ransomware attack is a perfect example of the dark side of legacy thinking and systems. Had leadership in the affected organisations made strategic infrastructure investments, replacing existing systems that are vulnerable to attack with modern systems implemented with the correct governance, systems and controls, the attack would not have caused as much harm as it did.

 

Using RegTech to automate these tactical and manual processes streamlines the approach to compliance and reduces risk by closely monitoring regulatory obligations. Vitally, it can lower costs by decreasing the level of resource required to manage the compliance burden. And RegTech can do so much more than just automate processes.

 

Organisations are using it to conduct data mining and analysis, and provide useful, actionable data to other areas of the business, as well as running more sophisticated aggregated risk-based scenarios for stress-testing, for example.

 

Deloitte estimates that in 2014 banks in Europe spent €55bn on IT; however, only €9bn was spent on new systems. The balance was used to bolt more systems onto the antiquated existing technologies and simply keep the old technology going.

 

This is a risky and costly strategy. The colossal resource required to keep existing systems going, patched and secure, coupled with managing the elevated levels of compliance requirements will drain budgets over time. Beyond that, the substantial risk associated with manually sourcing data, or using piecemeal solutions presents the very real risk of noncompliance.

 

RegTech is not a silver bullet and it is not going to solve all the compliance headaches businesses are suffering from. However, as the ESMA (European Securities Markets Authority) recently stated firms must “embrace RegTech, or drown in regulation”.

 

RegTech will play a leading role, especially when used to maximum effect. Take, as an example, reporting. We know through our research that this is an industry-wide challenge; on average a firm has 160 reporting requirements under different regulations globally, each with different drivers and usually with different teams producing those reports.

 

By using RegTech, not only can those team resources be reduced, but the agility and speed with which reports are produced helps ensure compliance deadlines are met. Resources can then be focused elsewhere, such as on driving innovation and moving the company forward.
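
As a purely hypothetical sketch of what "automating reporting" can look like in practice, the snippet below runs several report definitions over a single canonical transaction feed. The report names, fields and thresholds are invented for this example and are not drawn from any actual regulation.

    # Hypothetical illustration: one transaction feed, many report definitions.
    # Field names, report names and filters are invented for this example.
    from datetime import date

    transactions = [
        {"id": "T1", "asset_class": "equity", "notional": 1_200_000, "country": "GB"},
        {"id": "T2", "asset_class": "fx",     "notional":   300_000, "country": "DE"},
        {"id": "T3", "asset_class": "equity", "notional":    50_000, "country": "GB"},
    ]

    # Each "report" is just a named filter plus the fields it must contain.
    report_definitions = {
        "large_equity_positions": {
            "filter": lambda t: t["asset_class"] == "equity" and t["notional"] >= 100_000,
            "fields": ["id", "notional", "country"],
        },
        "all_fx_activity": {
            "filter": lambda t: t["asset_class"] == "fx",
            "fields": ["id", "notional"],
        },
    }

    def build_reports(txns, definitions, as_of=None):
        """Run every report definition over the same transaction feed."""
        as_of = (as_of or date.today()).isoformat()
        return {
            name: {
                "as_of": as_of,
                "rows": [{f: t[f] for f in spec["fields"]} for t in txns if spec["filter"](t)],
            }
            for name, spec in definitions.items()
        }

    for name, report in build_reports(transactions, report_definitions).items():
        print(name, report)

The toy data is beside the point; the shape matters. When report logic lives in configuration against one governed data set, rather than in separate spreadsheets owned by separate teams, the 160th reporting requirement becomes incremental effort rather than a new project.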

 

Rather than focusing on what a burden the regulations are, organisations that use RegTech will see them as an opportunity to get systems, processes and data in order, and to use that intelligence and those resources to drive the company to greater success. To take it one step further, I believe regulation does not hinder or stifle innovation; in fact, it breeds creativity and innovation.

 

If you would like to learn more about RegTech and my work with Hitachi, follow me on Twitter and LinkedIn.

Last week I visited Hannover Messe, the world’s largest industrial automation and IoT exhibition, for the first time, and I have to say I was overwhelmed by the sheer size and scale of the event. With 225,000 visitors and 6,500 exhibitors showcasing their industrial automation and IoT capabilities, the race is definitely on to take a share of the massive IoT market opportunity in these sectors. Estimates vary, but according to IDC the IoT market is expected to reach $1.9 trillion by 2021.

 

What I learnt in Hannover is that the industrial and energy sectors are on the cusp of a huge digital data explosion. Why? Because, like all industries, they are under pressure to innovate and embrace new technologies that will significantly accelerate the intelligence, automation and capability of factory production lines, reduce manufacturing defects and fault rates, and improve the reliability, performance and TCO of all kinds of industrial machinery as they become digitised. The data driving this explosion will include machine-generated data, sensor data and predictive analytics output, as well as data from enhanced-precision robotics and artificial intelligence. All of this data will give a competitive edge and valuable insight to companies that deploy these technologies wisely and use the data they generate to drive their business forward intelligently and autonomously.

 

These innovations arguably make Hannover Messe one of the most relevant exhibitions in the IoT space today, and last week Hitachi was able to showcase the full power of its industrial automation and IoT capability. This included Lumada IoT solutions, real industrial assets, advanced research, and its humanoid robotics vision through EMIEW, a customer service robot being developed by Hitachi to provide information in any language in public and commercial spaces, which generated huge interest from attendees.

 

Hitachi had a large team of IoT experts present who spoke in depth about the technologies and use cases its customers need to advance and digitise their businesses. To say Hitachi inserted itself into the IoT conversation last week is an understatement: Hitachi is serious about this business, and that was further reflected in the extensive global brand advertising campaign in and around the show, which included prominent adverts in Hannover’s main railway station, on the Hannover Messe sky walkways and in a number of global media publications, all driving the 225,000 visitors to its 400-square-metre booth to experience its IoT solutions.

 

As I left Hannover, I came away with two key takeaways. First, IoT is here now, and given the level of investment being made by companies, the focus on it and the potential returns for businesses, you can start to see how it will drive the next huge wave of industrial change. Second, Hitachi has the potential to be a dominant force in IoT and the ambition to be the market leader. Last week the company made a giant stride towards that goal. You can follow the conversation on Twitter by searching #HM17.

 


Help! Thinking Differently

Posted by Scott Ross Employee Mar 20, 2017

 

Help! The Beatles said it best. We'll come back to that later.

 

Beep! Beep! Beep! My alarm wakes me up and the sudden realisation sets in. Like many other money-savvy twenty-somethings, I rely on my smartphone, and a calendar reminder alerts me that, on this occasion, my car insurance expires soon. The inexorable battle through the web of online comparison supermarkets is about to commence. Before I set about my task I go into the kitchen to pour a cup of tea and discover that my milk is off. I nip across to the shop to pick up some more and pay for it with my smartphone. I pour my tea, take a deep breath, and off I go.

 

Naturally I start with my current provider and run a renewal quote on their website. Having spent the past 12 months as their customer, I had high hopes that this would set the standard for others to compete with. After some not-so-easy site navigation and a lot of persistence, I managed to get a renewal quote. Shockingly, it was significantly more than my current agreement, despite nothing changing other than my age. Having struggled through their website for the best part of an hour re-entering personal information that they could (and should) have easily auto-completed for me, needless to say I was far from pleased with the outcome.

 

 

Next, to the plethora of comparison sites. I use a quick Google search to decide which is best, select my suitor and off I go. I discover that this website is significantly easier to navigate, which somewhat alleviates the painstaking process of entering the exact same details I have already spent the best part of my morning typing into my current provider’s site. That pleasing process was enhanced further by the next page: the results! They were staggering. A large number of providers were offering a greater level of service at a considerably lower price. “How can that be?” I asked myself. For now, I had to focus on which offer was best and ponder that question later.

 

I review the top three policies independently, through both a Google search and the site’s own review tool, and finally settle on my desired option. Two clicks later and I’m on the new provider’s website, details already filled in, quote as per the comparison site, and a blank space waiting to complete the transaction. All I needed to do now was fill in my payment details and it was complete. Easy. I would have a new provider once my old contract ended… or so I thought!

 

 

Having settled on a new provider I go about cancelling my current service before the auto-renewal kicks in and I lose my hard-earned new policy. I call the contact centre, give my details and ask to cancel. The operator asks a few questions about “why” and then begins to offer discounts and price matching against what I’ve just signed up to. Why couldn’t they offer this level of service upfront? Why does it take me leaving for them to offer something better? In today’s economy where not just the savvy, but everybody is looking to get more for their money, why would a business continually act like this? This, in my opinion, shows a poor level of customer knowledge and more importantly a poor customer experience.

 

Quickly I begin to realise that many organisations across all consumer industries act in a similar way. In fact, only the ‘new-age’ organisations offer something different, and even then, are they maximising their potential? This got me thinking (back to the title): “Help! I need somebody, not just anybody!” I need my current provider to look after me. To help me. Even better, to do it for me. I need them to navigate the renewal journey for me. To offer me a bespoke service, price, whatever… designed to meet my needs and my characteristics. To act in my best interest. Maybe this is a utopia we may never reach, but I can’t help imagining a world where ‘the man’ is looking out for me, providing targeted messaging about me, my spend, and how and where to spend better, wiser and cheaper. Unlike The Beatles, most organisations aren’t drowning in their own success; instead, they are screaming out for a different kind of help! But what if they weren’t? Imagine a world where your bank offers you a discount at your regular coffee spot, knows you’re paying above the street average for home insurance and provides an alternative, automatically moves your savings to the best available rate, or suggests alternative insurance products based on your driving style, health or lifestyle. The list is endless.

 

 

The point of this story is the power of insight, experience and the Internet of Things (IoT). If our providers harnessed the data they already have (or could have) and turned it into valuable information, they would be more relevant to us and, in return, we would be better off. As consumers we are looking for greater value, and what better way to get it than our existing providers changing the game? One example could be taking the comparison game to us, offering their services bespoke to our needs; after all, they already know us. Another could be improving the journey through their website, making it easier to transact. What if my bank knew my milk was already off, alerted me to buy more and attached a special offer to the message?! By empowering their staff, systems and processes, even the oldest traditional organisations can realise the advantage. Increasing customer insight and ultimately improving customer experience will bring about new markets, greater revenues and thriving customer loyalty.
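
As a purely hypothetical illustration of that "do it for me" idea, the sketch below flags a customer who is paying above the street average for home insurance. The customer record, the street average and the threshold are all invented; the point is that a provider already holding this information could surface the saving proactively instead of waiting for the cancellation call.

    # Hypothetical example: proactively flag a customer paying above the local
    # average for home insurance. All data and thresholds are invented.
    customer = {"name": "S. Ross", "postcode": "AB1 2CD", "home_insurance_premium": 420.0}

    # What the provider might already know: the average premium on the same street.
    street_average_premium = {"AB1 2CD": 340.0}

    def renewal_alert(cust, street_avg, tolerance=0.10):
        """Return a message if the customer pays more than `tolerance` above the street average."""
        avg = street_avg.get(cust["postcode"])
        if avg is None or cust["home_insurance_premium"] <= avg * (1 + tolerance):
            return None
        saving = cust["home_insurance_premium"] - avg
        return (f"You're paying about £{saving:.0f} more than the average on your street; "
                f"here's an alternative quote before your renewal.")

    print(renewal_alert(customer, street_average_premium))

Nothing here requires exotic technology; it requires joining data the provider already holds and acting on it before the customer does.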

 

Don't miss my next blog to see if we can work it out!

 

 

If you would like to learn more about Hitachi and how we can help Financial Services organisations, click here.

 

To learn more about me visit my LinkedIn profile here.

Why Digital Transformation must be a strategic priority in 2017

 

It’s with good reason that Digital Transformation has become the latest watchword in our industry; organisations across the world are finally seeing the profound business advantages of leveraging digital technology.

According to a 2016 Forbes Insights and Hitachi survey of 573 top global executives, Digital Transformation sits at the top of the strategic agenda. Legendary former General Electric CEO Jack Welch sums up perfectly why Digital Transformation has become a board-level priority: “If the rate of change on the outside exceeds the rate of change on the inside, the end is near.”

 

Organisations are seeing such an unprecedented rate of change all around them that Digital Transformation is no longer a ‘nice to have’; it is a ‘must have’ for corporate survival. To discover the state of enterprises’ Digital Transformation projects in 2017, Hitachi partnered with UK technology publication Computer Business Review (CBR) to survey IT decision makers in its readership on their efforts. While not scientifically representative of enterprises across the UK or Europe, the research provides some enlightening anecdotal evidence.

 

In this blog, I’ll explore some of those findings and discuss why I think 2017 will be the year of Digital Transformation.

In the UK, just under two-thirds of CBR readers revealed they are through the emergent stages of Digital Transformation, and consider their organisation to be at an intermediate stage in their journey. Only one in ten described themselves as beginners, with one in four stating they are leading the pack when it comes to transforming their businesses.

 


We’ve found similar scenarios among many of our customers. Some are teetering on the edge, while others, such as K&H Bank, the largest commercial bank in Hungary, are already reaping the rewards. By upgrading its storage solutions, K&H Bank has halved the time it takes for new business information to arrive in its data warehouse, ready for analysis, and has cut its data recovery time by half. This enables K&H Bank to get quicker insights into its business and react faster than its competitors.

 

It is exactly this type of optimisation that is fuelling Digital Transformation: by cultivating improved internal processes and competencies, it drives tangible business benefits. In fact, just under two-thirds of CBR readers identified improving internal operations as the top driver for Digital Transformation, while a quarter highlighted customer experience.

 

Of course, while Digital Transformation can deliver both optimised operations and improved customer experience, focusing initially on internal programmes allows any issues to be ironed out and lessons learned. Take, for example, Rabobank in the Netherlands. The finance and services company has transformed its compliance operations by optimising them on a new platform. This strategy enables simplified access to the structured and unstructured data needed for investigations, easing the regulatory burden on the bank.

 


 

This kind of Big Data analysis, combined with other technologies such as cloud computing and the Internet of Things (IoT), is at the core of many successful Digital Transformation stories. Cloud computing, for example, was cited by 67% of the readers surveyed as helping them progress along their digital journey.

 

Indeed, our customers have demonstrated a keen interest in cloud technology as an integrated element of a Digital Transformation strategy. Deluxe, a US-based finance organisation, is benefitting from improved flexibility, security and control through Hitachi’s enterprise cloud offerings. By moving to a private cloud within a managed services environment, it now has the technology to integrate acquisitions, deploy next-generation applications and accelerate its time-to-market.

 

Other technologies, such as data analytics (cited by 20% of readers) and IoT (cited by 10%), are likely to grow in popularity as more powerful tools are developed. Although awareness of Artificial Intelligence (AI) is increasing, with innovative rollouts from organisations such as Enfield Council, it is not currently a strategic focus for UK businesses on their Digital Transformation journeys, being cited by only 3% of readers. This is likely to change, however, as more applications for the technology are discovered.

 

What our survey highlighted was not whether organisations are starting and progressing their Digital Transformation journey, but when and how far along the path they are. That’s not to say it’s easy. But there is help along the way: my colleague Bob Plumridge recently shared three excellent pieces of advice, wherever you are in your journey. And, most importantly, the rewards are worth it. Improving internal operations and processes will help drive increased innovation and therefore improve customer experience. Embarking on Digital Transformation will also help keep your pace of change ahead of the competition, just as Jack Welch advised.

Last week my team hosted an exciting event at the Four Seasons in Houston, TX, progressing our efforts in this vertical. The event mixed users, partners and customers, plus the many faces of Hitachi. Our aim was two-pronged:

  1. To be inspired through continued exploration of new challenges from the industry, and
  2. To validate the areas we are already progressing, adjusting based upon user feedback.

Doug Gibson and Matt Hall (Agile Geoscience) kicked us off by discussing the state of the industry and the various challenges of managing and processing seismic data. It was inspiring and certainly revealing to hear where the industry is investing across upstream, midstream and downstream. The meat of it: upstream used to be king, but investment is moving to both midstream and downstream. Matt expressed his passion for literally seeing the geological progression of the Earth through seismic data. What an infectious and grand meme!

[Seismic section image, including the earth's surface]

More generally, I believe our event can be seen as a "coming out party" for work we began several years ago; you'll continue to hear more from us as we work our execution path. Further, inspired by Matt Hall, we ran a series of un-sessions that resulted in valuable interactions.

 


The Edge or Cloud?

In one of the un-sessions, Doug and Ravi (Hitachi Research in Santa Clara) facilitated a discussion about shifting part of the analytics to the edge for faster and more complete decision making.  There are many reasons for this; I think the three most significant are narrow transmission rates, large data (in velocity, volume and variety), and tight decision-making schedules.  Even though some processes (especially geologic ones) may take weeks, months or years to conclude, when urgency matters a round trip to a centralized cloud fails.  Specifically, HSE (Health, Safety and Environment) matters, plus matters related to the production of both oil and gas, mandate rapid analysis and decision making.  Maybe a better way to frame this is through numerical orders of magnitude; specific details are anonymized to "protect the innocent."

  • Last-mile wireless networks are being modernized in places like the Permian Basin, with links moving from satellite (think kbps) to 10 Mbps using 4G/LTE or unlicensed spectrum.  Even these modernized networks may buckle when faced with terabytes and petabytes of data at the edge (see the quick calculation after this list).
  • Sensing systems from companies like FOTECH are capable of producing multiple terabytes per day, joining a variety of other emerging and very mature sensing platforms.  Digital cameras are also present to protect safety and guard against theft.  This means the full set of Big Data categories (volume, velocity and variety) exists at the edge.
  • Seismic exploration systems used to acquire data include "converged-like" systems placed in ISO containers to capture and format seismic data, potentially at the scale of tens of petabytes.  Because of the remote locations these systems operate in, there is a serious lack of bandwidth to move data from edge to core over networks; services companies literally ship the data from edge to core on tape, optical or ruggedized magnetic storage devices.
  • Operators of brownfield factories, with thousands of events and tens of "red alarms" per day, want to operate more optimally.  However, low-bit-rate networks and little to no storage in the factory to capture data for analysis suggest something more fundamental is needed at the edge before even basic analysis of current operations can start.
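
To put rough numbers on that mismatch, here is a back-of-the-envelope calculation in Python. The data volume and link speeds are illustrative assumptions, not measurements from any particular site, but even generous figures show a single day of edge data taking weeks to move over a dedicated 10 Mbps link.

    # Back-of-the-envelope transfer times for moving edge data to a central cloud.
    # The volumes and link speeds below are illustrative assumptions, not field data.

    def transfer_time_days(data_terabytes, link_mbps):
        """Days needed to move the data over a link fully dedicated to the transfer."""
        bits = data_terabytes * 1e12 * 8          # terabytes -> bits
        seconds = bits / (link_mbps * 1e6)        # bits / (bits per second)
        return seconds / 86_400                   # seconds -> days

    daily_sensor_output_tb = 2.0                  # e.g. one distributed sensing system

    for label, mbps in [("satellite (~0.25 Mbps)", 0.25), ("modernized 4G/LTE (10 Mbps)", 10.0)]:
        days = transfer_time_days(daily_sensor_output_tb, mbps)
        print(f"{label}: {days:.1f} days to move one day of data")

At satellite rates the backlog runs to years, and even the modernized link falls behind by more than an order of magnitude, which is why heavy analysis happens at the edge or the data travels on physical media.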

This certainly gets me thinking that, while the public cloud providers are trying to get all of this data into their platforms, there are some hard realities to cope with.  Maybe a better way to classify this problem is as trying to squeeze an elephant through a straw!  However, many of the virtues of cloud are desirable, so what can we do?

 

Progressing Cloud to the Edge

Certainly the faces of Hitachi already have industry-optimized solutions in the market that enrich data at the edge, analyze and process data to slim it down at the edge, and provide business advisory systems capable of improving edge-related processes.  However, my conclusion from last week is that resolving these complex problems is less about what kind of widget you bring to the table and more about how you approach solving a problem.  This is the spirit of the Hitachi Insight Group's Lumada platform, because it includes methods to engage users and ecosystems, and brings tools to the table as appropriate.  I was inspired to revisit problem solving (not product selling) because, as we closed our summit, Matt Hall said, "I was pleased to see that the Hitachi folks were beginning to honestly understand the scope of the problem."

 

Is O&G the poster child for edge cloud?  Given the challenges uncovered during our summit, plus other industry interactions, the likely answer is yes.  Perhaps the why is self-evident: processing at the edge, purpose-building for the industry and mixing in cloud design patterns become obvious as stacks are modernized.  It is the "how" that I believe deserves attention.  Matt's quote from the last paragraph guides us on how to push cloud principles to the edge.  Essentially, for this industry we must pursue "old fashioned", sometimes face-to-face, interactions with the people who engage in various parts of the O&G ecosystem: geologists, drilling engineers, geophysicists, and so on.  Through these interactions, which problems to solve, and their scope and depth, become more obvious and even compelling.  It is when we draft execution plans and make them real that we will resolve to build the cloud at the edge.  If we sit in a central location and merely read about and imagine these problems, we won't develop sufficient understanding and empathy to do our best work.  So, yes, oil and gas will engender edge clouds, but it is the adventure of understanding user journeys that guides us on which problems matter.

 

Attributions

  1. Top Banner Picture - Author: Stig Nygaard, URL: Oil rig | Somewhere in the North Sea... | Stig Nygaard | Flickr, License: Creative Commons
  2. Seismic Image - Relinked from Burning the surface onto the subsurface — Agile, and from the USGS data repository.

Data is the New Oil

Posted by Harry Zimmer Employee Jan 31, 2017

There are at least 20 terms today that describe something that was once called Decision Support (more than 30 years ago). Using computers to make fact-based, intelligent decisions has always been a noble goal for organizations to achieve. Here we are in 2017, and the subject of Decision Support has by and large been forgotten. The following word cloud shows many of the terms used today. They all point to doing super-advanced Decision Support. Some terms, like Data Warehouse, have been replaced (or superseded) by the concept of the Data Lake. Other older terms, like Artificial Intelligence (AI), have been re-energized as core to most organizations' IT plans for this year.

 

[Word cloud of the terms used today for Decision Support]

 

In discussing these topics with many customers around the world over the past 12 months, it has become clear to me that most companies are still struggling with deployment and, in some cases, with the ROI. Education levels are all over the map and the sophistication of systems is inconsistent.

 

In my upcoming WebTech (webinar) I will be sharing a new model that takes into account the old and the new. It will provide some foundational education in under an hour that should bring real clarity, especially for C-level executives.

 

I have developed what I think is a very useful model, or architecture, that can be adapted and adopted by most, if not all, organizations. The model lets an organization self-assess exactly where it is and what the road forward will look like, all with the goal of reaching the state of the art in 2017.

 

The model is a direct plug-in to the industry-wide digital transformation initiative. In fact, without the model, or something similar to it, a digital transformation project will most likely fail.

 

The other direct linkage is to another hot topic: the Internet of Things (IoT). As IoT becomes mainstream across organizations, it will be a valuable new source of data, drawing on the evolving world of sensor technologies.

 

I hope you are able to join me for this WebTech. I am sure you will find it extremely valuable to you and your organization, and that it will spur plenty of follow-on discussion.

 

To register for my upcoming WebTech, click here.

 

For additional readings:

  • Storytelling With Data, by Cole Nussbaumer Knaflic
  • 80 Fundamental Models for Business Analysts, by Alberto Scappini
  • Tomorrow Today, by Donal Daly
  • Predictive Analytics for Dummies, by Bari, Chaouchi, & Jung

In Superman 30, the Man of Steel theorises that the latest incarnation of his enemy Doomsday emits so much energy that, when he emerged, he boiled the ocean. Not an easy task, even for a supervillain, and certainly out of reach for mere mortals.

 

So why are some enterprises taking this approach with Digital Transformation projects? Moreover, if overreaching doesn’t work, what steps should be taken?

 

Hitachi recently partnered with Forbes Insights, interviewing nearly 600 C-level executives from North America, Latin America, Europe and Asia-Pacific. The global research revealed that a transition toward digital maturity involves five major steps, some of which are proving easier to take than others.

 

1. Top Strategic Priority

Half of the executives polled said their organisation will be vastly transformed within two years. I expect the figure is actually higher in Europe, where companies are already on the move. One bank we work with even has a permanent board member dedicated to, and responsible for, its Digital Transformation.

 

The realisation has dawned in boardrooms that growth and survival are now tied up with digital capabilities.

 

2. Enterprise-wide approach

The research revealed that cross-functional teams are not adequately involved in developing or implementing strategy, with the bulk of this work done by IT. In our experience, this is no longer the case across Europe or the Middle East. According to Gartner, for example, shadow IT investments (purchases outside of CIO control) often exceed 30 percent of total IT spend.

 

I recently attended an IDC event in Dubai dedicated to Digital Transformation in the banking and finance sector. The congress was dominated by line-of-business executives from sales and marketing rather than IT leaders. Every session and attendee I spoke with shared an active interest in making Digital Transformation the cornerstone of their company strategy.

 

3. Focused on business outcomes

The ability to innovate was the top measure of success for 46% of companies polled, and it’s something I hear a lot from customers.

 

The ability to innovate cannot be achieved through technology alone; enterprises should instead seek partners they can trust to solve the underlying technical and architectural challenges and deliver a solution that addresses and enables business outcomes.

 


 

One thing the report does not consider, however, is what will happen to those who fail to invest in digital capabilities. Failure to modernise cyber security systems, for example, is an issue regularly covered by media outlets. Prof Richard Benham, chairman of the National Cyber Management Centre, has even predicted that in 2017 "a major bank will fail as a result of a cyber-attack, leading to a loss of confidence and a run on that bank."

 

Digital Transformation isn’t essential just for growth, but for survival too.

 

4. Untapped potential of data and analytics

Only 44% of companies surveyed see themselves as advanced leaders in data and analytics. In my opinion, this is a conservative estimate. Some businesses may be guarding their achievements, as poker players do. The term ‘poker face’ refers to the blank expression players use to make it difficult to guess their hand. The fewer the cues, the greater the paranoia among the other players.

 

You could speculate that some businesses may be keeping their best weapons secret too.

Besides, nobody wants to be the company that brags one day and is overtaken the next.

 

But we should take comfort from those companies in Europe making solid progress. K&H Bank, the largest commercial bank in Hungary, has halved the time it takes for new business figures to arrive in its data warehouse, ready for analysis, and has cut its data recovery time by 50%. Or consider Rabobank in The Netherlands, which has gained “control of the uncontrollable” and mastered the handling of its data for compliance purposes.

 

5. Marrying technology with people power

When it comes to the challenges associated with Digital Transformation, the survey found that people are organisations’ biggest obstacle. Investing in technology ranked lowest, indicating that companies may already have the technology but not the skills. Support from strategic technology partners can help bridge that gap.

 

These obstacles also bring me back to my original warning: don’t try to boil the ocean. An organisation might race ahead and invest heavily in technology, but without the right culture and know-how it could waste an awful lot of money at best, and lose senior support at worst.

 

So, what can your business learn from this research? Here are three things that you could do, within a safe temperature range:

 

  1. Hire new talent to challenge the status quo. Your current team may not have the fresh vision needed to shift your enterprise to mode 2 infrastructure. You need an intergenerational mix of young, enthusiastic staff and seasoned experts.
  2. Nominate a senior executive as the Digital Transformation ambassador. Organisations need a senior sponsor to push the agenda. To overcome engrained ways of doing things, you need people with strong messages that can cascade down.
  3. Be bold and take calculated risks. One bank in Europe has even banned its CIO from buying mode 1 IT infrastructure – meaning the bank has no choice but to embrace a more agile digital environment (rather than fall back on the devil it knows). Another bank in The Netherlands took the bold step of replacing its hierarchical structure with small squads of people, each with end-to-end responsibility for making an impact on a focused area of the business.

 

To achieve Digital Transformation, enterprises need to push the internal idea of ‘safe’ as far as possible. As Mark Zuckerberg declared, “in a world that’s changing really quickly, the only strategy that is guaranteed to fail is not taking risks”. If a business takes risks iteratively and learns from its mistakes, it won’t end up trying to boil the ocean and failing from the sheer magnitude of the task.

 

If you would like to learn more about the research, I recommend you download the full report here.