Neuromorphic Computing For Data and Edge Computing
In a previous post, I wrote about Data Centric Computing, the movement to offload data management functions from CPUs to smart NICs, FPGAs, or DPUs (Data Processing Units), as NVIDIA calls them, so that CPUs can focus more of their power on application processing.
Another approach to Data Centric Computing is computational storage, as explained in a post by Stacy Peterson in SearchStorage, where computation is moved closer to storage to reduce the amount of data that moves between storage and compute. This is driven by the need to reduce latency in IoT and edge devices that must handle massive amounts of data. Steve Garbrecht explains how Lumada Edge brings DataOps to the edge in his post.
Hitachi is also working with Intel to develop neuromorphic hardware that distributes processing across various infrastructure elements, which could mean less reliance on centralized systems that require constant, expensive high bandwidth. Neuromorphic hardware is an electronic device that mimics the natural biological structures of our nervous system. It is an attempt to replicate the cognitive abilities of the brain, which processes information faster and more efficiently than computers because of the architecture of our neural system.
This may sound a little far out, but in March of this year, Intel announced the Pohoiki Springs system, which comprises 768 Loihi neuromorphic research chips, each with about 130,000 neurons, inside a chassis the size of five standard servers. It has a computational capacity of about 100 million neurons, roughly comparable to the brain of a mole rat.
Unlike in traditional CPUs, the memory and computing elements in the Pohoiki Springs system are intertwined rather than separate. That minimizes the distance data has to travel; in traditional computing architectures, data must constantly flow back and forth between memory and processor.
With neuromorphic computing, it is possible to train machine-learning models using a fraction of the data required on traditional computing hardware. The models learn somewhat the way human babies do: seeing an image or toy once and recognizing it thereafter. They can also learn from data nearly instantaneously, ultimately making predictions that could be more accurate than those of traditional machine-learning models.
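To make the idea of a neuromorphic "neuron" concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic element of spiking neural networks, in plain Python. The parameter values and function name are illustrative, not Loihi's actual design:

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0,
               v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron over a series of
    input-current samples. Returns (membrane trace, spike times)."""
    v = v_rest
    trace, spikes = [], []
    for t, i_in in enumerate(input_current):
        # The membrane potential leaks back toward rest while
        # integrating the incoming current.
        v += (dt / tau) * (v_rest - v) + i_in * dt
        if v >= v_thresh:      # threshold crossed: emit a spike
            spikes.append(t)
            v = v_reset        # reset after spiking
        trace.append(v)
    return np.array(trace), spikes

# A constant input drives the neuron to spike periodically.
current = np.full(100, 0.08)
trace, spikes = lif_neuron(current)
print(f"{len(spikes)} spikes at steps {spikes}")
```

Unlike the multiply-accumulate units of a conventional CPU or GPU, a neuron like this only produces output (a spike) when its input crosses a threshold, which is part of why spiking hardware can be so power-efficient on sparse data.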
Last year Hitachi joined the Intel Neuromorphic Research Community (INRC). Hitachi has joined forces with Accenture, Airbus, GE, Intel, and other INRC members to create proof-of-concept applications that will bring the most value to their businesses. Intel will leverage the insights that come from this customer-centric research to inform the designs of future processors and systems. These engagements will ensure Intel remains strategically positioned at the forefront of neuromorphic technology commercialization.
Hitachi is unique in the way it combines information technology (IT), including AI, big data analytics, and other digital technologies; operational technology (OT) for system control and operation; and an extensive range of products. Through its Social Innovation Business, Hitachi is providing digital solutions to help resolve challenges faced by customers and society.
“Intel’s Loihi and spiking neural networks (Loihi is the research chip in the Pohoiki Springs system; each chip includes 130,000 neurons optimized for spiking neural networks) have the potential to recognize and understand the time-series data of many high-resolution cameras and sensors quickly,” said Norikatsu Takaura, chief researcher of the Research & Development Group at Hitachi Ltd. “Neuromorphic computing and its technology stack will improve the scalability and flexibility of edge computing systems.”
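Before a spiking network can process camera or sensor data, the raw time series must be converted into spike trains. A common approach is rate coding, where larger signal values produce more frequent spikes. A minimal sketch (the function and parameters are illustrative assumptions, not part of Intel's software stack):

```python
import numpy as np

def rate_encode(signal, n_steps=50, rng=None):
    """Encode a normalized signal (values in [0, 1]) as Poisson-like
    spike trains: higher values fire more often (rate coding)."""
    rng = rng or np.random.default_rng(0)
    signal = np.clip(signal, 0.0, 1.0)
    # One row of 0/1 spikes per time step, one column per sample.
    return (rng.random((n_steps, signal.size)) < signal).astype(np.uint8)

samples = np.array([0.05, 0.5, 0.95])   # e.g. three sensor readings
spikes = rate_encode(samples)
print(spikes.mean(axis=0))   # firing rates approximate the inputs
```

Because only the spikes are transmitted and processed, quiet sensors generate almost no traffic, which is one reason spiking approaches suit bandwidth-constrained edge deployments.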
To gain insight into both electrical circuits and biological processes, neuromorphic engineers require interdisciplinary knowledge of biology, physics, and mathematics, which plays to the strengths of Hitachi’s Social Innovation Business. This is a fast-growing area: analysts forecast that the neuromorphic computing market could rise from $69 million in 2024 to $5 billion in 2029, and to $21.3 billion in 2034.
© Hitachi Vantara LLC 2021. All Rights Reserved.