From the BrainSTEM, Science & Technology

‘Westworld’ and the ethical dilemma of sentient machines

While deep in conversation with your childhood best friend, you catch a glimpse of a frayed wire and grey hinge protruding from the side of their neck.

“I thought you knew,” they calmly remark, noticing your gaze. Suddenly, they swing the hinge open to reveal an intricate set of wires and flashing lights hiding behind what you thought was their face. “I’m synthetic, but I think and feel just as you do, so what difference does it make?”

What difference does it make?

Despite Facebook’s $7 million investment in a new artificial intelligence (AI) lab at McGill, AI is still far from matching human capabilities. Nonetheless, pop culture encourages viewers to ponder the controversies surrounding the future of AI. HBO’s critically acclaimed series, Westworld, explicitly wrestles with many of these themes.

In the show, humanoid machines known as “hosts” are designed to entertain human customers in a life-sized, Western-themed, adult playground. The hosts are pre-programmed with memories and daily routines—convinced that their lives are as ‘real’ as any human’s. However, their memories reset daily, while guests—humans visiting the park—continue to use them for their own enjoyment.

As the show progresses, the line between hosts and guests blurs. Hosts begin to learn of their imprisonment and question their purpose. They are capable of everything the guests are capable of, most notably suffering, which raises the question of whether it is ethical to use the hosts as appliances for the guests’ pleasure.

Modern society grapples with the concept of equality across race, gender, and numerous other identity traits. Now, society must consider what ‘equality’ between those biologically created and those engineered would mean. As with most ethical questions, no answers are self-evident, but philosophy and cognitive science can help shed light on this dilemma.

An important insight comes from what Alan Turing coined the “Imitation Game,” now referred to as the “Turing Test.” Simply put, if a machine’s responses in conversation are indistinguishable from a human’s, then observers have no better grounds to deny that it thinks than they have for any human, and so must assume it is conscious.

A converse philosophical quandary, the “Problem of Other Minds,” highlights that an individual can never be certain that anyone else is thinking. Descartes’ “I think, therefore I am” guarantees a person certainty only of their own thought. Although humans believe that other humans are capable of thought because they are similar, they can never be sure.

However, non-human animals also seem capable of thought, despite having brains that humans struggle to understand. With all of this in mind, if a machine with sensory receptors and a speaker system can maintain a conversation and respond to its environment, one must conclude that it’s conscious. Stevan Harnad, adjunct professor of psychology and cognitive science at McGill and the Université du Québec à Montréal, insists that the composition of the cognizer is irrelevant.

“Why would anyone think that [controlling the AI] is any different than a human, because of the metal?” said Harnad.

He went on to assert that the current treatment of non-human beings, AI or otherwise, is more a matter of power than ethics.

Harnad also drew parallels between Westworld and the rodeos recently held for Montreal’s 375th anniversary. Bulls in the rodeo are used as a means to an end for human pleasure in the same way the show’s hosts are. The modern treatment of animals dates back to Descartes’ declaration that animals are “automatons,” devoid of feeling, as absurd as that may seem now. This sentiment creates a real-life Westworld in McGill’s own backyard.

History is filled with ethical failures that were only obvious in hindsight. The debate over how to treat non-human individuals isn’t a new one, but with the inevitable progress of AI, the window for treating it as a mere thought experiment is shrinking. Conventional ethics assert that discrimination based on origin is unethical. AI demonstrates that human-like cognition can originate outside the realm of biology, and the treatment of animals shows how incoherent the valuation of non-human feeling can be.

If the question is a matter of flesh versus steel, and not consciousness versus unconsciousness, the answer is clear: There is no difference.
