Science & Technology

The social consequences of AI

Upon hearing the words “artificial intelligence” (AI), people tend to have mixed reactions. Science fiction’s favourite trope is AI taking over the world, and literature has portrayed such ideas since at least 1921, when Karel Čapek introduced the word ‘robot’ in his famous play R.U.R. The powerful applications of AI today often evoke visions of a utopian world where humans can do anything at the press of a button. As this narrative moves closer to reality, academics, computer scientists, and the general public are forced to ponder AI’s societal implications.

McGill’s Institute for Gender, Sexuality, and Feminist Studies hosted a research talk on Feb. 20 as part of its Feminist and Accessible Publishing and Communications Technologies Speaker and Workshop Series. The talk, entitled ‘Why AI Needs Collective Action,’ was presented by Meredith Whittaker, co-director of New York University’s AI Now Institute and founder of Google’s Open Research Group.

Many are right to think that AI taking over the world is far-fetched. As Whittaker pointed out, AI is not a self-conscious machine; rather, it is an assembly of knowledge and skills wielded by those in power.

“[AI] is a set of technical capabilities that are being recklessly integrated in social domains such as healthcare, education, criminal justice, and almost everything we can think of,” Whittaker said.

According to Whittaker, when we talk about AI, we must also confront issues of power imbalances. AI has become popular in recent years largely because modern hardware has the massive computational power necessary to handle large amounts of data.

“Major tech companies happen to have all of the necessary equipment [and] a lot of social data alongside a power structure designed to collect and record [these] data,” Whittaker said. “Current AI, the one touching our lives, is more or less from corporate companies, and only five companies in the West have these resources to develop and deploy AI [.…] If you ask any AI start-up, they will tell you that they are renting their computational power from one of the three major tech companies, […] Google, Microsoft, and Amazon, [who] resort to shady techniques to get [the] data.”

Whittaker explained how current AI data collection practices can violate privacy. For example, HireVue uses an algorithm to predict the best potential hires, shoplifter-prediction software used by stores in Japan analyzes shoppers’ behaviour to flag those it deems likely to commit a crime, and Amazon Ring has partnered with police departments to create what is essentially a privatized surveillance network. With no current checks on this power, Amazon can push new features without regulation.

“Because [these technologies are] primarily developed by private companies, socially significant decisions are made behind a veil of corporate secrecy,” Whittaker said. “Most of what the world does know about AI is through corporate marketing and PR departments [.…] The truth is hidden from the public and even the workers, [and] in a lot of cases, what we know is through investigative journalists and whistleblowers.” 

Whittaker, who previously worked at Google, was one of the organizers of the 2018 Google walkouts.

“As workers, we didn’t have access to a lot of information, and there was a whole network we had to set up,” Whittaker said. “Problems of bias and justice are fundamental issues and require us to examine structures of power and look at who is benefitting and who is harmed.”

Whittaker explained that people in power can use AI to isolate or discriminate against certain groups, with consequences ranging from being passed over for a job to being labelled a criminal.

“No one is [being asked] to stop doing science, but the system requires change,” Whittaker said. “We need to [start adding collective action] at the front of our AI […], a kind of system able to check the power of a tech before it’s too late.”
