Containers have been fueling full-stack development at webscale companies for a number of years now. Financial services firms, being risk averse, are typically slow to jump on new trends, but they have started adopting containers at a rapid rate now that containers have gone mainstream. If you’re a financial institution and you aren’t at least taking a serious look at microservices and containers…you might be behind already.
Case in point: FINRA (a non-profit authorized by Congress) oversees more than 3,700 securities firms. It goes through over 37 billion transactions per day, handles over 900 cases of insider trading a year, and levies disciplinary actions against over 1,500 firms. http://www.finra.org/newsroom/statistics.
When you look at the numbers year over year, you start to get the picture that fines, fraud, and insider trading are growing as rapidly as data and technology change. The number of data sources you need to traverse in a short amount of time is huge. Going through that many transactions a day (with roughly 2.5 insider trading cases surfacing each day), queries over that much data can take hours unless you factor in containers and the ability to blend data sources quickly and return results fast. It’s like a data-driven arms race.
This is where containers can help, and they are already driving the financial services industry forward across regulatory, operational, and analytical areas. Here are a few areas where I think containers are most impactful:
- Bare Metal – Customers are increasingly looking for bare-metal options to quickly spin containers and microservices up and down. This helps in two ways: first, they reduce licensing fees for hypervisors; second, they can utilize the hardware more directly and quickly. This buys them economies of scale and a good ROI, with the software-defined data center (SDDC) and software-defined networking (SDN) being two large drivers of this trend.
- Automation – I’m a huge fan of automation, and when it comes to digital platforms that need little to no human interaction, banking and finance are no strangers to it. People are prone to error, whereas automation is only as fallible as its programming. Traditionally there have been a lot of analysts tied to these banking and finance queries, having to parse through large amounts of data. One example of automation is that you no longer need to go to your bank branch and interface with a teller. Personal connections and customer interaction are quickly being replaced with the ability to open your mobile phone and transfer money anywhere you want, pay a bill, or send money to your friends, all with the click of a button. I can tell you what I spent, where I spent it, and what category it falls into within seconds, all without ever talking to a teller or needing some fancy analyst. Automation is the answer, and it’s no different with containers.
- New Regulations – Governments always want to know where, how, and by whom money moves. Compliance and fraud concerns are at an all-time high. Look no further than the Bangladesh Bank heist, where over 80 million dollars was stolen by hackers, to realize this is a serious concern, and it could have been far worse: https://www.reuters.com/article/us-usa-fed-bangladesh/bangladesh-bank-exposed-to-hackers-by-cheap-switches-no-firewall-police-idUSKCN0XI1UO. A misspelling of “Shalika Foundation” as “Shalika Fandation” saved Bangladesh Bank from having potentially over a billion dollars stolen. In that case, the hackers’ lack of automation helped, but the far bigger story is the cybersecurity risk the bank faced. Banks can’t afford to miss any transaction happening anywhere they operate.
- Cybersecurity – Financial security, as noted above, is a big real-time, data-driven operation that requires tactics and tools that are responsive and can scale. This is again where container environments thrive. They can help identify and prevent money laundering and intrusions like the one above without having to count on hackers misspelling something, taking the human element out of the picture. Cybersecurity threats are on the rise, and it takes nothing more than falling behind on the latest security patches for attackers to have a big impact once they get into your environment. Target, Visa, Sony, Equifax (and their customers) have all learned what can happen with a breach.
- Scale of transactions – As with the FINRA example above, as we get increased access to our money and more ability to move it quickly, financial institutions need to keep up. With data growing 10x and unstructured data growing 100x, the need to parse through transactions quickly is becoming ever more challenging. Containers and scale-out microservices architectures are the keys to solving this puzzle.
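To make the scale-out idea concrete, here is a minimal sketch, in Python with hypothetical names and a made-up worker count, of one common pattern behind these architectures: hashing an account ID to a partition so each containerized worker independently processes a consistent shard of the transaction stream.

```python
import hashlib

NUM_WORKERS = 8  # hypothetical: one container per partition


def partition_for(account_id: str) -> int:
    """Map an account ID to a worker partition via a stable hash."""
    digest = hashlib.sha256(account_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_WORKERS


# Toy transaction stream for illustration.
transactions = [
    {"account": "ACCT-1001", "amount": 42.50},
    {"account": "ACCT-2002", "amount": 9000.00},
    {"account": "ACCT-1001", "amount": 13.75},
]

# Group transactions by partition; in production each shard would be
# consumed by a separate containerized worker that can be scaled out.
shards = {}
for txn in transactions:
    shards.setdefault(partition_for(txn["account"]), []).append(txn)
```

Because the hash is stable, all transactions for the same account land on the same worker, so per-account state (spending patterns, daily limits) stays local to one container instead of requiring a scale-up monolith.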
I can remember as a kid I had a paper register recording how much money I had, and once a month or so I could take my allowance to Fifth Third Bank, where they would deposit my money and write in my new total. My mom would also keep her checkbook up to date; every transaction she ever made, from ATM withdrawals to checks, was religiously recorded in it. I can’t tell you the last time I was in a bank, let alone kept a register log. They still send me one, but I think it’s in a box somewhere with those checks I rarely use unless forced to. Financial institutions now need to have all my transactions accessible quickly. They need to watch for fraudulent transactions: where I am, how much I’m withdrawing a day, and what my normal spending pattern looks like, to stop identity theft. That’s tough to do without heavy real-time analytics, and even tougher without containers.
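As an illustration of the kind of spending-pattern check described above, here is a hedged sketch in Python. The function name, history, and z-score threshold are all assumptions for illustration; real fraud systems use far richer models.

```python
from statistics import mean, stdev


def is_suspicious(history, amount, z_threshold=3.0):
    """Flag a transaction whose amount falls far outside the
    account's recent spending pattern (simple z-score test)."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > z_threshold


# Hypothetical recent daily spend for one account.
history = [25.0, 30.0, 22.0, 28.0, 35.0, 27.0]

print(is_suspicious(history, 29.0))    # prints False: fits the pattern
print(is_suspicious(history, 5000.0))  # prints True: far outside it
```

Running dozens of such checks per transaction, across millions of accounts, is exactly the kind of embarrassingly parallel workload that scales well across many small container instances.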
So what are the limitations of current systems? Why not just keep doing what we’ve been doing?
There’s the old adage about doing things the way you always have and expecting a different result. VMs are like my old register: well suited to those old monolithic applications. Not that there was anything wrong with going to the teller to make transactions; it’s just clunky, slow, and expensive. VMs are the equivalent of the teller. They aren’t responsive, and they can’t meet the scale of modern distributed systems. Scale-up was the answer in the past (more CPU, more memory, more overhead, expensive licenses, and maintenance), but go-big-or-go-home doesn’t work in today’s world. These dedicated clusters might work hard sometimes, but more often than not you’re scaling a large system up for those “key” times when you need it. With a highly scalable architecture, you can scale up and down quickly based on your needs, without overbuying hardware that sits idle. I won’t even touch on the benefits of cloud bursting and being able to quickly scale into a cloud environment.
Secondly, integration for traditional architectures is difficult, because you had to worry about multiple applications, integration environments, drivers, hypervisors, and golden images just to get up and running. How and where the data moved was secondary to just getting all the parts and pieces put together. Scale-out, composable container architectures designed to address specific problems like data ingestion, processing, streaming, and networking (e.g., Kafka, Spark, Cassandra, Flink) solve the issues of complex integration. These architectures are centered around scaling, tackling large data problems, and integrating with each other.
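The composability argument can be sketched without any particular framework. Below, plain Python generator functions stand in for containerized stages (ingestion, processing, sink); the stage names, record format, and flagging threshold are hypothetical, and in a real deployment each stage would be a service such as Kafka, Spark, or Flink rather than a function.

```python
def ingest(raw_records):
    """Ingestion stage: parse raw comma-separated records."""
    for line in raw_records:
        account, amount = line.split(",")
        yield {"account": account, "amount": float(amount)}


def process(transactions):
    """Processing stage: tag large transactions for review."""
    for txn in transactions:
        txn["flagged"] = txn["amount"] > 10_000
        yield txn


def sink(transactions):
    """Sink stage: collect results (a database in production)."""
    return list(transactions)


raw = ["ACCT-1,250.00", "ACCT-2,25000.00"]

# Stages compose like containers in a pipeline:
# the output of one is the input of the next.
results = sink(process(ingest(raw)))
print(results[1]["flagged"])  # prints True
```

The design point is that each stage has one narrow job and a simple streaming contract between stages, so any stage can be replaced or scaled out independently, which is exactly what container orchestration makes cheap.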
So, to answer the question of whether financial services are ready for containers: undoubtedly yes. I would almost say they can’t survive without them. Today’s dated VM systems aren’t ready to tackle the current problems, and they certainly don’t scale as well. In my next blog, I’ll go through some stacks and architectures that show how you can get significant results specifically for financial services.