
Combatting pseudoscience with reason and rationality

Albert Einstein once said that “two things are infinite: The Universe and human stupidity; and I’m not sure about the Universe.” Although a quick fact-check shows that the acclaimed physicist might not have actually said this, the essence of the statement remains clear—there are no limits to human folly. Pseudoscientific beliefs and mistrust in science are a testament to this statement, yet the subject remains a complicated and multifaceted problem.

Pseudoscience is a body of “knowledge” that is presented as scientific yet does not meet rigorous scientific standards such as the criterion of falsifiability. As a branch of science denialism and misinformation, pseudoscience includes ideologies such as evolution denialism, opposition to vaccines, and flat-Earthism.

While some pseudoscience fields pretend to be scientific (like homeopathy), others are built on denying well-established facts. One infamous example is the denial of the fact of evolution.

“People who disbelieve [evolution] often do so because it clashes with their beliefs, they experience cognitive dissonance,” said Jonathan Jarry, science communicator at McGill’s Office for Science and Society (OSS) and co-host of the Body of Evidence podcast. “So how do they resolve this? The easiest way is to explain science as false—they maintain their pre-existing belief [….] Our brain is not really a science adjudicator—it’s a lawyer that wants you to win every time in order to save face, and it takes effort to move away from this compulsion.”

While Earth’s spherical shape is more obvious than the fact of evolution, there are flat-Earthers who claim that satellite images are fake. And these are not isolated individuals exclaiming their disbelief: Entire communities have formed around this shared scepticism. Flat-Earthers even have their own website where they sell flat-Earth maps. 

“Flat-Earthers really exemplify the main characteristic of grand conspiracy believers—they may not agree on what the truth is, but they all agree that the mainstream narrative is wrong,” Jarry said. “Sometimes, it’s just a lack of expertise to make sense of what is good information and what isn’t. Sometimes, it’s about emotions, about what feels right. And very importantly, there’s a social component—if you look at the flat-Earthers, they’re a part of a community.”

Jarry further suggested that what unites believers in conspiracy theories is the illusion of having shared, secret knowledge that the rest of society cannot access. While beliefs like evolution denialism and flat-Earthism have been around for centuries, the Internet has exacerbated the phenomenon of misinformation.

“Nonsense has always been with us, but now we have bloggers with millions of followers, despite them having no relevant scientific education,” said Joe Schwarcz, the director of McGill’s OSS and professor of chemistry, in an interview with the Tribune. “They have learned the language of science, they’ve clothed themselves in the garb of science, and they can sound very appealing.”

The enormous amount of information available on the Internet often makes it difficult to distinguish reliable sources from fraud, especially if pseudoscientific claims are presented with a veneer of professionalism—something that Schwarcz discusses often on his podcast, The Dr. Joe Show.

“As soon as [scientists] hesitate, when we have to sprinkle our language with ‘but’s and maybe’s,’ [pseudoscience promoters] jump in there,” Schwarcz said.

Professor Samuel Veissière, co-director of the Culture, Mind, and Brain program at McGill and associate professor in the Department of Anthropology, added that the propagation of pseudoscience has itself become a hot research topic.

“Mechanisms of information polarization online have been extensively studied in cognitive science: An overabundance of information, paired with a climate of low trust in the collective, tends to create cognitive bottlenecks for evolutionarily ‘cheap’ information like threat, moral outrage, social affiliation, and mass herding,” Veissière told the Tribune.

Yet, distinguishing science from pseudoscience might not always be easy—as a consumer of information, at some point, you have to trust somebody. According to Veissière, extensive research in cognitive science has shown that when it comes to decisions about who and what to trust, the source of information is often more important to readers than the content.

This presents an issue when individuals trust a news source or a content creator on the Internet and believe them regardless of whether the information conveyed seems reasonable. Even traditionally trustworthy sources, though, must be questioned.

A good example is Andrew Wakefield’s 1998 paper in The Lancet, which linked autism to the measles, mumps, and rubella (MMR) vaccine. Though Wakefield was later found to have falsified data, suspicion of vaccines grew in popularity—as evidenced during the COVID-19 pandemic—demonstrating that a publisher’s reputation can lend unwarranted credibility to a claim.

Wakefield’s fraudulent publication also showed the vulnerable side of peer review and demonstrated the need to devise ways to ensure replicability and integrity in research. According to Jarry, while some incentives exist to replicate previous research, young researchers are primarily encouraged to conduct original investigations rather than reproduce earlier studies. 

“While we in the world of science worship at the altar of peer review, peer review is not the end goal,” Schwarcz said. “When a reviewer is asked to review a paper, they have to assume that the data presented was properly gathered and that the paper was honestly written because the reviewer cannot redo the work—[they] have to assume legitimacy.”

Yet, even when valid scientific results are presented, they are not always communicated effectively. Complicated terminology is one reason for this: beta-D-fructofuranosyl-alpha-D-glucopyranoside might sound like a toxic compound, but it is actually the International Union of Pure and Applied Chemistry name for sucrose (sugar). 

“[Scientific] jargon can be twisted by people who either don’t know how to interpret it or who want to lie to the public to make [a] paper say things that [it] didn’t [mean to] say, and most people are not going to check the paper itself,” Jarry said. 

While more robust science education in schools is one way to combat science misrepresentation, deeper socio-historical reasons often lead to science mistrust in marginalized communities, such as Indigenous and Black communities in Canada. For a long time, “science” was used to legitimize white supremacy and inflict violence on Black and Indigenous peoples, as well as other people of colour. A notorious example of this is the Tuskegee experiment in the United States from 1932 to 1972, during which the United States Public Health Service and the Centers for Disease Control and Prevention actively prevented Black men from getting syphilis treatment so that researchers could study the natural development of the disease. 

“Science can be associated with dominant groups and symbols that carry a strong connotation of aggression,” Veissière wrote. “Science ‘skepticism’ in these cases is a maladaptive, but understandable[,] protective mechanism.”

Veissière added that effective strategies to restore marginalized communities’ trust in science would be to “recruit local leaders and trusted community members who can act as ‘influencers’ to help people gain trust.”

When working to educate and build trust, scientists must be realistic about which communities are willing to listen to their arguments. While attempts to debate committed anti-vaxxers or flat-Earthers are often futile, there are better approaches to combat science misrepresentation.

“We have to focus on a rather large segment of the population who really have questions, who legitimately don’t know, because there is so much information and misinformation,” Schwarcz explained. “We have to concentrate on [those] who [are] willing to be convinced with proper scientific arguments.” 

Before trying to convince people, the first step is much simpler: We need to listen.  

“Do more listening first—if you want to convince somebody, you have to earn their trust,” Jarry said. “Earning their trust starts with listening to what they have to say and showing them that you care about them as a person and about what they think.”
