AI and Solomon's Code

By Hubert Yoshida posted 05-19-2019 00:00



There once was a king of Israel named Solomon, who was the wisest man on earth. One day he was asked to rule between two women, both claiming to be the mother of a child. The arguments on both sides were equally compelling. How would he decide this case? He ordered that the child be cut in half so that each woman would receive an equal portion. One mother agreed, while the other pleaded that the baby be spared and given to the care of the other woman. In this way King Solomon determined who was the real mother.

If we had submitted this to an AI machine would the decision have been any different?

Solomon’s Code is a book written by Olaf Groth and Mark Nitzberg and published in November of last year, so it is fairly up to date on recent happenings in the AI world. “It is a thought provoking examination of Artificial Intelligence and how it will reshape human values, trust, and power around the World.” I highly recommend that you read this book to understand the potential impact AI will have on our lives, for good or bad.

The book begins with the story of Ava, who is living with AI in the not too distant future. AI has calculated her probability of developing cancer like her mother and has prescribed a course of treatment tied to sensors in her refrigerator, toilet, and ActiSensor mattress. Her wearable personal assistant senses her moods. The insurance company and her doctor put together a complete treatment plan that considers everything from her emotional well-being to her work activities and even the friends she associates with. Her personal assistant makes decisions for her as to where she goes to eat, what music she listens to, and who she calls for support.

As we cede more of our daily decisions to AI, what are we really giving up? Do AI systems have biases? If AI models are developed by data scientists whose personality, interests and values may be different than an agricultural worker or a factory worker, how will that influence the AI results? What data is being used to train the AI model? Does it make a difference if the data is from China or the United Kingdom?

The story of Solomon is a cautionary tale. He built a magnificent kingdom, but the kingdom imploded due to his own sins, and it was followed by an era of violence and social unrest. “The gift of wisdom was squandered, and society paid the price.”

The Introduction to this book ends with this statement.

“Humanity’s innate undaunted desire to explore, develop, and advance will continue to spawn transformative new applications of artificial intelligence. The genie is out of the bottle, despite the unknown risks and rewards that might come of it. If we endeavor to build a machine that facilitates our higher development – rather than the other way around – we must maintain a focus on the subtle ways AI will transform values, trust, and power. And to do that, we must understand what AI can tell us about humanity itself, with all its rich global diversity, its critical challenges, and its remarkable potential.”


This book was of particular interest to me since Hitachi’s core strategy is built around Social Innovation, where we operate our business to create three value propositions: improving customers’ social values, environmental values, and economic values. In order to do this we must be focused on understanding the transformative power of technologies like AI, for good or bad.





05-24-2019 04:58

Thanks for the link. Yes, diversity becomes more important, as models and training data can project biases if they are developed by the same type of data scientist or engineer. As your link points out, the voice on Siri is female, and her responses could be considered subservient to males. Ava, the young lady who is living in the future AI age in this book, has a personal assistant who speaks with the voice and persona of her ex-boyfriend. Substituting the voice and behavior of a personal assistant with someone you know may be a good solution, but since this is an ex-boyfriend it annoys her until she changes the persona on her personal assistant.

05-23-2019 17:43

The United Nations has expressed concern that Siri and Alexa are perpetuating sexism because their responses come in a submissive female voice that accepts flirtation and verbal abuse (examples in the CNN article). This is especially important information for those of you with children, who might mimic the AI voice responses. It is yet another reason to get more women and people of color involved in AI programming and machine learning.

Here is the article from CNN: