
The right to be forgotten

How targeted advertising really predicts consumer patterns

Madison McLauchlan, Managing Editor

Last semester, I travelled to Toronto by train to see a concert. My friend and I stayed at a modest Airbnb in someone’s suburban basement to save some cash. When I arrived, I hopped in the shower to wash off the grime and sweat from the five-hour train ride and was greeted by a plethora of shampoos and conditioners to choose from. There must have been at least 10 different bottles. So I grabbed a bottle of volumizing Pantene, one that I’d never seen before, and lathered up. 

Emerging from the shower, I dried my hair and was astounded at how shiny and soft it felt. “Hey, this shampoo is really amazing,” I told my friend. I didn’t mention its brand, its scent, or its colour. All I said was “this shampoo.” 

Two days later, back in Montreal, I was scrolling through my Instagram feed when an advertisement came up for the exact shampoo, down to the brand and type. I immediately felt a jolt of fear—the kind of panic that takes hold when you realize that you changed your clothes in front of the window, or when your music starts playing out loud instead of through your headphones. Sure, I’d encountered targeted ads that were scarily accurate, or that seemed to intuit my wants or needs—it’s not unusual to mention a brand name one day only to have it appear as an advertisement the next. But to receive an ad for a shampoo that I’d only ever used once in my life and had never looked up on the internet—it felt like an invasion of not only my privacy, but my mind. If my phone was not listening to my conversations, then how had Instagram’s, and by extension, Facebook’s, insidious algorithm pinpointed the exact product I happened to like out of an array of products displayed to me at the Airbnb? 

I still don’t have an explanation. Though it may have been a coincidence, this chilling experience spurred me into a research frenzy over just how customized advertisements have become. By now, most college-aged students like me have been submitting their data, with or without their consent, to big tech companies like Facebook and Google for years. 


Much of modern marketing is built upon behavioural advertising. In this model, companies and third-party platforms build a profile of you based on your internet activity. What you like, what websites you visit, what ads you click on, what you watch on Netflix—all of this information is collected and used to recalibrate the content shown to you. Even more invasive metrics, like cursor movement and how long you linger on a certain page, are considered fair game.
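To make that concrete, here is a toy sketch in Python of how such a behavioural profile might be assembled from a stream of browsing events and then used to rank candidate ads. The event types, weights, and ad topics are all hypothetical; real ad platforms are far more complex and far less transparent.

```python
from collections import Counter

# Hypothetical browsing events: (event_type, topic, seconds_spent)
events = [
    ("page_view", "haircare", 45),
    ("ad_click", "haircare", 5),
    ("page_view", "travel", 120),
    ("video_watch", "travel", 600),
    ("page_view", "haircare", 30),
]

# Some behaviours are weighted more heavily than others (weights are made up).
WEIGHTS = {"page_view": 1.0, "ad_click": 5.0, "video_watch": 2.0}

def build_profile(events):
    """Aggregate raw events into a crude interest score per topic."""
    profile = Counter()
    for event_type, topic, seconds in events:
        profile[topic] += WEIGHTS.get(event_type, 1.0) * seconds
    return profile

def rank_ads(profile, ads):
    """Order candidate ads by how strongly their topic matches the profile."""
    return sorted(ads, key=lambda ad: profile.get(ad["topic"], 0), reverse=True)

ads = [
    {"name": "Volumizing shampoo", "topic": "haircare"},
    {"name": "Suitcase sale", "topic": "travel"},
    {"name": "Cold-press juicer", "topic": "wellness"},
]
print(rank_ads(build_profile(events), ads)[0]["name"])  # the best-matching ad
```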

Typically, digital marketing agencies are hired as intermediaries that craft many of the ads that appear on our endless Facebook, Twitter, or TikTok scrolls. One such Montreal-based agency, Turko Advertising, works with clients like Bell, Narcity Media, and Iris. When I spoke to the founder, Remi Turcotte, he told me that marketing techniques play on emotional triggers to encourage sales. In scarcity marketing, for example, companies signal that an item is only available in limited quantities. Advertisers can also give out special offers or coupons to encourage customers to reciprocate by making purchases.

“It’s about serving the right ad to the right person at the right time,” Turcotte said. “In a macro mindset, the idea is to create a message that is going to be appealing to the target customer [....] You want an ad experience that is optimal.” 

One way to personalize ads is to know where the customer is, and therefore, what products and stores are nearby. Tracking an individual’s location data is what allows apps like Waze, owned by Google, to suggest ads based on what they will encounter while travelling from point A to point B. With location services turned on, we are consenting to our every step being tracked for profit.

According to Renee Sieber, an associate professor in the Department of Geography and president-elect of the McGill Association of University Teachers (MAUT), prediction becomes more robust when companies amalgamate the data of many individuals in a given location and make decisions based on meta trends. 

“It is amazing how many apps demand location information from you whether or not it's obvious why they would need that information from you,” Sieber said. In fact, tracking the longer-term movement data of a group of users can be even more profitable. 

If you’ve ever opened up Instagram in class, chances are that companies have used your geophysical and temporal data to optimize ads accordingly. If you’re a student, they might show you stationery, laptops, or other school supplies. Tracking doesn’t stop when you leave the classroom. Companies are already one step ahead of you and will figure out what you want next—fast food deals for lunch, for example. During Reading Week, when hordes of McGill students travel in the same short period of time, companies will start promoting ads for suitcases, vacation deals, or other relevant travel items. 
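As a rough illustration, location and time-of-day targeting can be thought of as matching a user’s current context against eligibility rules. The sketch below is purely hypothetical (the place categories, hours, and ad names are invented), but it captures the gist of how a student opening an app on campus at noon might be shown lunch deals and stationery ads.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AdRule:
    name: str
    places: set    # place categories where the ad is eligible
    hours: range   # hours of the day when it should be shown

# Entirely hypothetical targeting rules.
RULES = [
    AdRule("Stationery sale", {"campus"}, range(8, 18)),
    AdRule("Lunch combo deal", {"campus", "downtown"}, range(11, 14)),
    AdRule("Vacation packages", {"airport", "train_station"}, range(0, 24)),
]

def eligible_ads(place: str, now: datetime):
    """Return ads whose place and time-of-day rules match the user's context."""
    return [r.name for r in RULES if place in r.places and now.hour in r.hours]

# A student opening an app in class at noon:
print(eligible_ads("campus", datetime(2022, 2, 14, 12, 5)))
# ['Stationery sale', 'Lunch combo deal']
```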

Personally, I don’t find the idea of sharing my location all that disturbing; apps on iOS are, with some exceptions, legally required to notify me when my location is being tracked. But I’d never understood the extent to which companies fuse that data together, layering my individual data points onto those of the rest of the population to piece together the clearest possible picture of my personality. 

Tracking users’ movement is not only restricted to the physical realm. Nowadays, almost all websites use cookie tracking to save users’ log-in info, track their browsing history, and sell info to third-party companies. Cookies, which are small morsels of text data downloaded onto your computer when you access a website, are the main way your behaviour is tracked on the internet for ad personalization. Third-party cookies can even integrate data about your online shopping across multiple websites. 


“This is how websites know that you were there and keep you logged in,” Turcotte explained. “This also helps third-party platforms and ad servers follow you because your navigational information is stored in a cookie.” 
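A simplified sketch of that mechanism, with invented site names and a stand-in ad server rather than any real company’s system: when two unrelated sites both embed content from the same third-party ad server, the browser presents that server with the same identifying cookie on every visit, letting it stitch the separate visits into one cross-site history.

```python
import uuid

class AdServer:
    """Stand-in for a third-party ad server (e.g. served from adnetwork.example)."""

    def __init__(self):
        self.histories = {}  # cookie ID -> list of sites where the ad loaded

    def handle_request(self, cookies: dict, referring_site: str) -> dict:
        # If the browser has no ID cookie yet, mint one (akin to Set-Cookie).
        visitor_id = cookies.get("visitor_id") or str(uuid.uuid4())
        self.histories.setdefault(visitor_id, []).append(referring_site)
        return {"visitor_id": visitor_id}

# The same browser loads two unrelated pages that both embed the ad server.
ad_server = AdServer()
browser_cookies = {}
for site in ["shampoo-reviews.example", "travel-deals.example"]:
    browser_cookies = ad_server.handle_request(browser_cookies, site)

# The ad server now knows both visits belong to one person.
print(ad_server.histories[browser_cookies["visitor_id"]])
# ['shampoo-reviews.example', 'travel-deals.example']
```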

The myth that our phones are listening to us is a cogent response to the age of surveillance, but it’s ultimately untrue. When companies can access your location and internet activity, they don’t need to listen to know everything about you—they already have everything they need to know. In fact, voice-activated smart devices, like Amazon’s Alexa or Apple’s Siri, only record audio after detecting the “wake word.” Jonathan Sterne, a professor and James McGill Chair in Culture & Technology, explained that even though our phones aren’t recording us, our voices still make their way into advertising infrastructure.

“[T]here are cases where companies actively listen to people,” Sterne wrote in an email to The McGill Tribune. “Every time someone calls in to a customer support line and hears that their call will be ‘recorded for quality assurance,’ their own voice is being recorded by the company, and likely profiled and added to a database.” 

Many of these voice profiling technologies are not only inaccurate, but also plagued by biases—speech recognition tools often fail to recognize English spoken with a non-Western accent as well as common dialects like African American Vernacular English (AAVE). Racial and ethnic bias is everywhere in tech, from the Anglo-centric way that keyboard letters are encoded to the mostly white faces that facial recognition software is trained on. Although tech companies have faced increasing scrutiny about their questionable commitments to equity, it remains difficult to tackle systemic problems when their leadership remains overwhelmingly white.

Sieber said that though most people think of artificial intelligence as a neutral tool, this is a misconception—the biases of our society trickle down into our technologies as well. And as digital marketing companies get better at integrating ads into social media feeds, it becomes more difficult to view ads with a critical lens. In fact, we are increasingly exposed to native advertisements, where content creators integrate ads into the middle of YouTube videos or podcasts without disrupting the flow of regular content. The conversational nature of such ads inherently exploits the trust we place in our favourite creators to have their viewers’ best interests in mind. While this kind of deception might be obvious to hyper-online young adults, it’s not necessarily clear to people who are less digitally literate.

“There is a digital native versus digital immigrant problem,” Sieber told me. “There are people who were born to the devices and the platforms and there are people who came to them [....] I think that people are aware of the amount of data that is being shared. The question is whether they care enough to do things like block [tracking].”


“Ultimately, the problem here is corporate secrecy, and the fact that people are able to sign away so many of their digital rights in the name of convenience,” Sterne wrote. 

But in this modern age of capitalist consumption, convenience is a very powerful thing. If people believe they benefit from advertisements anticipating their wants and needs, then the question of privacy violation becomes moot. Convenience is not the only thing companies are trying to sell—modern wellness culture is constantly striving for optimization in all its forms. It wants to sell you the best life possible—just look at the rise of mental-health apps, scheduling software, and expensive juicers. 

Our current situation may be unprecedented, but it’s not unanticipated. In 1974, sociologist Steven Lukes theorized three dimensions of power, the last of which feels the most prescient for our age. In this third dimension, one group of people is dominated by another and acquiesces to that domination. Crucially, the dominated even come to believe that this arrangement improves their lives. 

“A lot of companies will say we're doing this not to spy on you, but to customize your [user experience] and therefore be able to sell you more products,” Sieber said. “Don't you want products that are specifically targeted to you?”

Intuitively, yes, but the answer will vary depending on who you ask. Many people are particularly wary of Big Tech’s surveillance and opt out of social media, or of smartphones altogether—but this, too, reflects its own kind of privilege. The livelihood of an UberEats driver, on the other hand, relies on the location-based, targeted advertising of our age. A low-income single mother may appreciate targeted ads for diapers on sale—maybe they even serve as a helpful reminder. Who gets to opt out of surveillance capitalism? Rich people have the right to be forgotten, but people working multiple jobs, older people unfamiliar with technology, and working immigrants may not have the luxury of free time to look into how their data is being shared and how to avoid giving it up. “The right to opt out of these things implies that you were fairly secure in what you're doing,” Sieber said. 

Facebook and Google may not just be exploiting your penchant for high-waisted jeans—they could be exploiting traumatic events in your life. Anything is fair game to these companies; they can serve you ads about next steps to take after a miscarriage, or about how to cope with the death of a loved one. Racial and demographic biases are also built into marketing algorithms, so ads for employment or financial opportunities may reach certain groups and not others. Ads even cater to stereotypes of a given group’s purported interests—women tend to see less political content on Facebook than men, for example. Political echo chambers, as we’ve seen with the COVID-19 anti-vaccine movement, can push those teetering on the ideological brink into an abyss of misinformation. In many instances, right-wing lobbies have even funded ads spreading scientific inaccuracies.

As with many other forms of activism, individual actions like buying flip phones and deleting your Instagram account aren’t enough to bring down Big Tech, because these companies will still hold the power to track and control everyone else. There must be limits and regulations set at the structural level to protect user privacy for those who want it, and greater transparency for those who are not privileged enough to be internet-literate. Right now, the burden of weighing corporate surveillance against everyday convenience falls on individual users rather than governments. “[I]t's on us to figure out whether we want to be inconvenienced,” Sieber said.

According to Turcotte, the era of granular behavioural profiling based on browsing data might be coming to an end. Indeed, Google announced that it would be phasing out third-party cookie tracking on its Chrome browser by the end of 2023. Thanks to consumer protection and bolstered data privacy laws, tech companies and advertising agencies like Turko will have less to work with going forward. It seems that small and mid-sized companies will have to pivot toward building brand loyalty through email lists and personalized offers. 

“We’re going [away] from tracking user actions,” Turcotte said. “If you look further into the future, you won’t know whether a person [...] clicked on that page or went here.”

But with billions of dollars in their coffers, companies like Google and Apple will always be poised to win the battle over digital privacy, taking advantage of every legal loophole to get there. 

As for my own interactions with websites that exploit my data, I’m not sure where to go from here as a consumer. I went through all of the “interests” that Facebook had collected on me and removed every single one. I turned off tracking on Twitter and disabled sharing between apps on my iPhone. But I still have trouble explaining why I truly care about companies using my data to sell me products. I don’t think that keeping my data off the grid will improve my life, but the whole experience calls the idea of free will into question. What does it mean to want something when all your wants and needs are fed to you? After all, I did end up buying that shampoo. 

What kind of person does that make me? I guess you’ll have to look at my targeted ads.

Illustrations by Jinny Moon, Design Editor