There once was a king of Israel named Solomon, who was the wisest man on earth. One day he was asked to rule between two women, each claiming to be the mother of the same child. The arguments on both sides were equally compelling. How would he decide the case? He ordered that the child be cut in half so that each woman would receive an equal portion. One mother agreed, while the other pleaded that the baby be spared and given to the care of the other woman. In this way King Solomon determined who was the real mother.
If we had submitted this case to an AI system, would the decision have been any different?
Solomon’s Code, written by Olaf Groth and Mark Nitzberg, was published in November 2018, so it is fairly up to date on recent happenings in the AI world. It is “a thought-provoking examination of Artificial Intelligence and how it will reshape human values, trust, and power around the world.” I highly recommend reading this book to understand the potential impact AI will have on our lives, for good or bad.
The book begins with the story of Ava, who is living with AI in the not-too-distant future. AI has calculated her probability of developing cancer like her mother and has prescribed a course of treatment tied to sensors in her refrigerator, toilet, and ActiSensor mattress. Her wearable personal assistant senses her moods. Her insurance company and her doctor put together a complete treatment plan that considers everything from her emotional well-being to her work activities and even the friends she associates with. Her personal assistant decides for her where she goes to eat, what music she listens to, and whom she calls for support.
As we cede more of our daily decisions to AI, what are we really giving up? Do AI systems have biases? If AI models are developed by data scientists whose personalities, interests, and values may differ from those of an agricultural worker or a factory worker, how will that influence the results? What data is being used to train the AI model? Does it make a difference whether the data comes from China or the United Kingdom?
The story of Solomon is a cautionary tale. He built a magnificent kingdom, but the kingdom imploded due to his own sins, and it was followed by an era of violence and social unrest. “The gift of wisdom was squandered, and society paid the price.”
The introduction to the book ends with this statement:
“Humanity’s innate undaunted desire to explore, develop, and advance will continue to spawn transformative new applications of artificial intelligence. The genie is out of the bottle, despite the unknown risks and rewards that might come of it. If we endeavor to build a machine that facilitates our higher development, rather than the other way around, we must maintain a focus on the subtle ways AI will transform values, trust, and power. And to do that, we must understand what AI can tell us about humanity itself, with all its rich global diversity, its critical challenges, and its remarkable potential.”
This book was of particular interest to me because Hitachi’s core strategy centers on Social Innovation, through which we operate our business to create three value propositions: improving customers’ social, environmental, and economic values. To do this, we must focus on understanding the transformative power of technologies like AI, for good or bad.