New generations of business applications centered on the value that can be derived from data are powering digital transformation. These applications use algorithms under the covers to advise us, to help us make better decisions and, increasingly, to literally run the business. Algorithms underpin getting people from point A to point B (Uber, Kayak, Waze); they suggest the fastest way to procure anything and everything (Alibaba, Amazon), where to find a job (LinkedIn), where to find new customers (Marketo) and even where to find friends (Facebook).
Contrast these with traditional enterprise business applications, which, as Wing VC co-founder Peter Wagner puts it in his essay “Why Data-First Applications Will Come to Rule Enterprise Software,” have business logic at their core. These applications were programmed to do what the business decided they should do. Their value lies in driving efficiencies and automating best practices learned over decades of trial and error, essentially scaling and accelerating the human workforce (ERP, SCM, CRM). These powerhouse applications have delivered proven, stalwart value over time, yet their relational architecture can be a stumbling block to taking advantage of algorithms for modern analytics.
Data-value-driven analytics applications are empowering line-of-business users outside the centralized IT construct to perform complex analysis and become arm-chair (or rather office-chair) data scientists, using their newly minted superpowers to uncover new opportunities and drive incremental business value that fuels digital transformation. Supplying the magic behind these superpowers are data scientists working behind the scenes with application developers to embed algorithms, along with their own expertise, into analytics applications.
Analytics applications are data- and algorithm-intensive, and as such they can consume vast computing resources. They benefit from new(ish) computing architectures and technologies like in-memory computing (IMC). With IMC, the application assumes all the data it needs is available in main memory, which provides the processing power to run analytics on very large datasets, as Gartner says in its report Invisible Advanced Analytics: Coming to a Business Application Near You, “…at a frequency and level of granularity not possible with traditional computing architectures.”
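To make the granularity point concrete, here is a minimal Python sketch (with an invented sales dataset; real IMC platforms hold far larger working sets) of why RAM-resident data changes the economics: once the data is in memory, regrouping at a coarser or finer grain is just another cheap pass, with no round trip to disk or to a separate reporting system.

```python
from collections import defaultdict
import random

# Hypothetical in-memory dataset: a year of sales events, generated
# purely for illustration.
random.seed(42)
sales = [
    {"region": random.choice(["east", "west"]), "day": d,
     "amount": random.randint(1, 100)}
    for d in range(365) for _ in range(10)
]

def revenue_by(key):
    """Aggregate at an arbitrary granularity; cheap because every
    record is already resident in main memory."""
    totals = defaultdict(int)
    for row in sales:
        totals[row[key]] += row["amount"]
    return dict(totals)

by_region = revenue_by("region")  # coarse grouping
by_day = revenue_by("day")        # much finer grouping, same cost profile
```

The same `revenue_by` call serves both the coarse and the fine-grained question; on a disk-bound architecture, the finer grouping would typically mean pre-built summary tables or a long batch job.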
Another technology, a cross between analytics and transactional processing, is hybrid transactional/analytical processing, or HTAP. HTAP is designed to eliminate the need for users to switch between transactional and analytics application types, delivering the benefits of both. In the same report, Gartner asserts that the lines between transaction processing and analytics will begin to blur as these two technologies are combined.
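The core HTAP idea, one live store serving both workloads, can be sketched with SQLite's in-memory mode (a toy stand-in; real HTAP platforms add columnar layouts, concurrency control and scale-out). Transactional writes land in the same table that analytical queries read, with no extract-and-load step in between:

```python
import sqlite3

# One in-memory store handles both sides of the house.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, product TEXT, qty INTEGER)"
)

# Transactional side: individual inserts and updates as orders arrive.
with conn:
    conn.executemany(
        "INSERT INTO orders (product, qty) VALUES (?, ?)",
        [("widget", 3), ("gadget", 1), ("widget", 2)],
    )
    conn.execute("UPDATE orders SET qty = qty + 1 WHERE id = 2")

# Analytical side: aggregate over the live data, no ETL in between.
rows = conn.execute(
    "SELECT product, SUM(qty) FROM orders GROUP BY product ORDER BY product"
).fetchall()
# rows: [('gadget', 2), ('widget', 5)]
```

The analytical query sees the update the moment it commits, which is exactly the switching cost HTAP removes: users no longer hop from the system of record to a separately refreshed reporting copy.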
Yet another complementary technology being incorporated into analytics applications is Hadoop, which can act as a repository supporting analytics with external data, thanks to its ability to ingest data without a predefined schema.
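This "schema-on-read" style can be illustrated in a few lines of plain Python (the field names here are invented): records of different shapes are ingested as-is, and structure is imposed only at query time, when the analysis decides which fields it cares about.

```python
import json

# Raw feed of JSON lines with inconsistent shapes; ingestion does not
# require declaring a schema up front.
raw_lines = [
    '{"user": "ana", "clicks": 3}',
    '{"user": "bo", "clicks": 7, "referrer": "search"}',
    '{"sensor": "t1", "temp_c": 21.5}',  # a different record shape entirely
]

# Ingest: parse and store whatever arrives.
records = [json.loads(line) for line in raw_lines]

# "Schema" applied on read: this analysis only wants per-user clicks,
# so it filters and defaults at query time.
click_totals = sum(r.get("clicks", 0) for r in records if "user" in r)
```

A relational store would have rejected the sensor record (or forced an up-front schema redesign); here it simply sits in the repository until some other analysis wants it.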
This is an exciting time to be at Hitachi if you’re interested in analytics, data-value-driven applications and the compute infrastructures that run them. Hitachi group companies are into this space big time: Pentaho integrates multiple data types to create agile, extensible data sources; Hitachi Insight Group develops Internet of Things (IoT) solutions; and Hitachi Data Systems offers converged and hyperconverged systems, flash storage, object storage, cloud and integrated solutions to run applications, including the most data-intensive advanced analytics and databases from SAP, Oracle, Pentaho and many more.