When Hinton Met Newton...

Blog post created by Lothar Schubert on Aug 9, 2018

Even with deep learning (DL), physics-based models (PBMs) still have their place.

Rule-based systems can’t scale. Try building a rule-based autonomous car: you can get pretty far with a few rules – literally, on a straight lane, in perfect conditions. However, add poor visibility, unexpected objects, or sensor malfunctions, and there’s always “just one more” rule needed to deal with those exceptions. The problem, of course: there always is one more rule. And the more rules you add, the more dependencies you need to account for. It’s a race you cannot win. Enter the world of deep learning. Rather than hard-coding the rules, learn the rules – and keep re-learning them, as rules are meant to be broken (isn’t that how we learned as children?!). Problem solved. Or, not so fast.

Despite many successes across industries, deep learning doesn't yet quite live up to its general hype [Read: "But for all the recent investment, the scope of AI deployment has been limited so far."]. That's because deep learning comes with its own set of issues: a huge appetite for data, especially labeled training data; ethical issues such as trained biases and lack of algorithmic transparency; a skills shortage (evidenced by the fact that ML start-ups are often acquired as much for their people as for their technology); and high demands on IT infrastructure (compute, network, storage) to train models (which increasingly may include the generation of simulated training data).

Increasingly, deep learning (think: Geoffrey Hinton) and physics-based models (think: Isaac Newton) are learning (yup) how to work together. The approach: engineer what you know, and learn what you don't. There's ample scientific research and there are well-established engineering methods – no need to throw those overboard. At the same time, there's an opportunity to augment those methods, and to challenge some conventional truths.
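To make "engineer what you know, and learn what you don't" a bit more concrete, here is a minimal, purely illustrative sketch in Python (my addition – the falling-object example and noise levels are made-up assumptions): a physics-based baseline explains most of the behavior, and a small data-driven model is fit only to the residual the physics misses.

```python
import numpy as np

# "Engineer what you know": free fall from rest, ignoring drag.
def physics_model(t):
    g = 9.81
    return 0.5 * g * t**2  # distance fallen (m)

# Toy "reality": measurements also contain an effect the physics model
# ignores (a drag-like term) plus sensor noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 5, 200)
measured = physics_model(t) - 0.8 * t**3 / 6 + rng.normal(0, 0.5, t.size)

# "Learn what you don't": fit a small model to the residual only.
residual = measured - physics_model(t)
coeffs = np.polyfit(t, residual, deg=3)            # data-driven correction
hybrid_prediction = physics_model(t) + np.polyval(coeffs, t)

print("physics-only RMSE:", np.sqrt(np.mean((measured - physics_model(t)) ** 2)))
print("hybrid RMSE:      ", np.sqrt(np.mean((measured - hybrid_prediction) ** 2)))
```

Swap a deep network in for the polynomial and the pattern stays the same: the learned part only has to capture what the physics leaves unexplained, which typically takes far less data than learning everything from scratch.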

Two examples for illustration:

For autonomous driving on flat surfaces in good weather conditions, the dynamic model of an Ackermann-steered vehicle can be derived from Newton's laws. That's a good starting point. Conventional wisdom says that in challenging driving conditions, such as blizzards, it's a good idea to reduce speed for safety. While that may be true, deep learning can help optimize driving in those situations based on learned conditions, yielding recommendations for optimal (not always: slowest) driving behavior.
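As a purely illustrative sketch (my addition, with made-up wheelbase, speed and steering values), here is the simplest version of such a model in Python – a kinematic bicycle approximation of an Ackermann-steered vehicle. The full dynamic model derived from Newton's laws would additionally account for mass, yaw inertia and tire forces.

```python
import math

def simulate_ackermann(v=10.0, delta=0.05, wheelbase=2.7, dt=0.01, steps=500):
    """Integrate the kinematic bicycle model:
    x' = v*cos(theta), y' = v*sin(theta), theta' = (v / L) * tan(delta)."""
    x, y, theta = 0.0, 0.0, 0.0
    path = []
    for _ in range(steps):
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += v / wheelbase * math.tan(delta) * dt
        path.append((x, y, theta))
    return path

# 5 seconds at 10 m/s with a small, constant steering angle.
x_end, y_end, heading_end = simulate_ackermann()[-1]
print(f"end pose: x={x_end:.1f} m, y={y_end:.1f} m, heading={heading_end:.2f} rad")
```

In the blizzard scenario above, the learned component would then sit on top of such a baseline, adjusting commanded speed and steering for road and visibility conditions the idealized model does not capture.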

For commercial aircraft, one key issue is the lack of real-life data on catastrophic engine failures. That is because engines are built, tested and maintained to avoid exactly those situations. Physics-based models, applying engineering and materials science laws, are critical for predictive maintenance, e.g. recommending maintenance or repair based on actual usage. What's more, those models are increasingly used to generate simulated failure data to feed deep learning loops.
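As a sketch of that last point (again my own, with a made-up degradation law – real engine programs use far richer physics), a physics-inspired model can generate labeled "run-to-failure" sensor traces that a deep learning model could then train on:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_run_to_failure(n_cycles=300):
    """Toy degradation: damage grows roughly exponentially with load cycles
    (Paris-law-like), and a vibration sensor picks it up with noise."""
    load = rng.uniform(0.8, 1.2, n_cycles)                      # per-cycle load factor
    damage = np.cumsum(1e-3 * load * np.exp(np.arange(n_cycles) / 100.0))
    vibration = 1.0 + 5.0 * damage + rng.normal(0, 0.05, n_cycles)
    labels = (damage > 0.8 * damage[-1]).astype(int)            # 1 = near failure
    return vibration, labels

# Build a small simulated training set of engine runs.
runs = [simulate_run_to_failure() for _ in range(100)]
X = np.concatenate([v for v, _ in runs])
y = np.concatenate([l for _, l in runs])
print(X.shape, y.shape, "near-failure fraction:", float(y.mean()))
```

Windows of these simulated traces, labeled healthy versus near-failure, would then feed the deep learning loop that real-world failure data alone could not support.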

In summary, it's not "either / or" (Hinton or Newton), but often "both together" (Hinton plus Newton). And it will probably stay that way for quite some time.

This is one reason why I'm so excited to have joined Hitachi a few weeks ago, after spending most of my career at IT (SAP) and OT (GE) companies. Hitachi is in a unique spot to pull this off with Lumada, given our deep expertise in both OT and IT (Data & Analytics).

 

Note: this is a re-blog from my public LinkedIn blog, modified with some Hitachi-specific thoughts.
