Off the Board, Opinion

It’s not just humans who can be biased

The tech industry has long been a demographically homogeneous place, and there has been plenty of conversation about how to make the industry more inclusive for people who don’t fit the stereotype of the Silicon Valley tech bro. Making the products themselves more inclusive, however, has received far less public attention. In part, this can be attributed to the idea that mathematics, computer science, and software engineering are inherently logical and egalitarian. Equations, after all, don’t have subconscious biases, or so the thinking goes.

This line of reasoning has two fatal flaws. The first is that programmers are notoriously bad at predicting what the code they write will actually do, which means that software can have unintended social consequences. The second goes back to a common saying in machine learning circles: “Garbage in, garbage out.” In other words, models are only as good as the data they are trained on. As such, it’s critical to collect data that reflects the diversity of the people those models are used to study.

For example, a machine learning model used to predict the likelihood that convicted criminals would reoffend after serving jail time assigned, all other inputs being equal, a higher risk of reoffending to people of colour than to Caucasian individuals. In another case, image recognition software repeatedly misclassified black people as gorillas. Currently, the top 10 Google image results for “engineer” are all men in hardhats.

Even if software runs precisely according to its specifications, failure to consider the social ramifications of a new product or feature can result in painful consequences for users. Take Facebook’s birthday notifications, for example. In theory, they are a convenient way to avoid the potential awkwardness of forgetting someone’s big day. But when someone is grieving the death of a loved one, the last thing they want to see is a Facebook notification about the deceased’s birthday. Although this might seem obvious, no such safeguard existed when Facebook first rolled out the feature. Facebook now allows users to memorialize accounts so that friends of deceased individuals no longer receive birthday reminders, but this option is relatively new.


The performance of a machine learning model is highly dependent on the data it is trained on. At a high level, facial recognition software is developed by taking a classification algorithm, showing it a large number of pictures of things that are faces and things that are not faces, and labelling each one “This is a face” or “This is not a face” accordingly. But if an algorithm is only ever shown white faces, then it only learns that white faces are faces. It may still recognize people of other ethnicities, because their faces share features with the white faces it has seen, but it will be much more likely to misclassify them as “not faces.”
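To make this concrete, the following is a minimal, hypothetical sketch using synthetic data and scikit-learn, not a real face dataset or any company’s actual system. It shows how a classifier trained on data dominated by one group can score well on that group while failing badly on an under-represented one.

```python
# A toy illustration (synthetic data, not a real face dataset or any company's
# actual pipeline): train a "face / not face" classifier on data dominated by
# group A, then measure accuracy separately for group A and group B.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n_pos, n_neg, shift):
    """Generate 2-D synthetic features: 'faces' cluster around `shift`,
    'not faces' cluster around the origin."""
    pos = rng.normal(loc=shift, scale=1.0, size=(n_pos, 2))
    neg = rng.normal(loc=0.0, scale=1.0, size=(n_neg, 2))
    X = np.vstack([pos, neg])
    y = np.array([1] * n_pos + [0] * n_neg)
    return X, y

# Group A dominates the training set; group B's faces look statistically
# different (a different feature distribution) and are barely represented.
Xa, ya = make_group(n_pos=1000, n_neg=1000, shift=3.0)
Xb, yb = make_group(n_pos=20, n_neg=20, shift=-3.0)
X_train = np.vstack([Xa, Xb])
y_train = np.concatenate([ya, yb])

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Evaluate on fresh samples from each group.
Xa_test, ya_test = make_group(n_pos=500, n_neg=500, shift=3.0)
Xb_test, yb_test = make_group(n_pos=500, n_neg=500, shift=-3.0)
print("Accuracy on group A:", model.score(Xa_test, ya_test))  # close to 1.0
print("Accuracy on group B:", model.score(Xb_test, yb_test))  # roughly 0.5
```

In this toy setup, the model reaches near-perfect accuracy on the over-represented group and does little better than chance on the other, purely because of what it was shown during training.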

This is a real-world problem that Snapchat ran into while developing its facial-recognition filters. The filters simply did not work as well for people with darker skin, because the training data Snapchat had used consisted mostly of images of white people. Nobody had thought to make sure that the images used to train the models were representative of the full diversity of human faces.

A lack of representative sampling has also been seen in clinical trials, where drugs are often tested only on men. Drugs that are shown to be safe and effective in clinical trials are sometimes found to be ineffective or even detrimental to the health of women who take them.

Scientific and technological developments can have major ramifications for society, and the effects of a product on its users are often unanticipated by its creators. It’s critical that the scientists and engineers behind these technologies take the time to consider how their work may affect people who are not exactly like themselves.

Clare is a U2 Math and Computer Science student, and a Web Developer at the McGill Tribune.

 
