Luke Stark, an assistant professor in the Faculty of Information and Media Studies at Western University, delivered a virtual talk on Jan. 18 about the history of affective computing and emotions in cybernetics, the science of communications and control in humans and machines.
The lecture was part of an annual speaker series titled “Disrupting Disruptions: Feminist and Accessible Publishing, Communications, and Technology,” organized by Alexandra Ketchum, a faculty lecturer at McGill’s Institute for Gender, Sexuality, and Feminist Studies (IGSF). Stark’s research focuses on the ethical, historical, and social impacts of artificial intelligence (AI) and machine learning, which are also the topics of his current book project, Ordering Emotion: Histories of Computing and Human Feelings.
Ketchum launched the speaker series in 2019. In an email to The McGill Tribune, she noted that Stark’s talk was the 71st of the series.
“When I started the series, I never imagined how big it would become,” Ketchum wrote. “I wanted to showcase the work of scholars, artists, and people in industry whose voices are less often showcased when it comes to questions of technology. This means that every speaker in the series is a person of colour, queer, a woman, non-binary, or disabled (and many of the speakers in the series share multiple of these identities).”
The talk was given in ‘scenes.’ Scene one, titled “Where Did These Faces Come From, and Why Do They Matter?”, centred on Facebook’s reaction icons. In this scene, Stark discussed how the like button, a 2009 design choice, exemplifies a method social media platforms use to extract data on user expression.
“That’s what these reactions are, they are structured data about emotional expression,” Stark said. “In some cases, this data is structured by the user, is developed by the user; in some cases, it is collected without the user knowing or realizing.”
Scene two of the talk outlined how technology companies became interested in developing emotional AI and how AI understands emotions. Stark argued that computing technologies reinforce definitions of emotion that prioritize the body’s physiological reactions.
“Computing technologies are taking up much older ideas, and in some ways taking up ideas that have been largely discredited […] in biology, physiology,” Stark said. “These kinds of quantifiable, biophysical definitions of emotions have tended to either directly or indirectly justify emotion and emotional control as both a proxy and mechanism for maintaining social norms […] of colonialism [and] misogyny.”
In scene three, Stark discussed the misrepresentation of emotional AI by technology companies. He characterized Apple’s introduction of Memoji, animated emojis that follow the user’s facial expressions, as a camouflaged cover for the company’s data extraction to improve its facial recognition technology. Stark also noted that digital features are often rife with biases, pointing to research by Lauren Rhue showing that facial recognition systems fail to accurately interpret the emotions of Black people.
In his concluding remarks, Stark quoted Michal Kosinski, a computational psychology and psychometry expert at Stanford University, who equated a smartphone to a “psychological questionnaire that we are constantly filling out, both consciously and unconsciously.”
Zoe Leousis, U1 Arts, attended the talk in the hopes of learning more about the technology she interacts with on a daily basis.
“It’s something you don’t really think about, how personal the targeting of certain platforms really is and how easily we can be swayed by it,” Leousis told the Tribune. “I would have never thought of the robot maid from The Jetsons as a mirror to the stereotypes being pushed by these companies.”