From Fame to Shame

The fraught history of McGill’s university rankings




Kyle Dewsnap

News Editor

In 2006, Tom Traves, then-president of Dalhousie University, formed a coalition of 11 schools, nine of which were among the top 15 research universities in Canada. The presidents of all 11 universities signed an open letter to Maclean’s Magazine denouncing the magazine’s method of producing university rankings for its annual education issue. These schools refused to participate in the Maclean’s questionnaires, which gathered administrative data about schools’ student bodies as well as their opinions of other institutions.
“We welcome public assessment of our work, and all our institutions devote significant resources to that end,” Traves’s letter reads. “However, it is truly hard for us to justify the investment of public funds required to generate customized data for your survey when those data are compiled in ways that we regard as over-simplified and arbitrary.”
McGill, though, continued to participate in the survey and ranked first overall in the 2006 edition of the list. McGill’s celebratory press release made no mention of Traves’s letter.
University rankings are as popular as they are controversial: Mary Dwyer, the editor of Maclean’s annual education issue, noted just how influential the rankings are for students picking out a post-secondary school.
“[In 1991,] there wasn't really any available [....] easy-to-access information on all the various universities [in Canada],” Dwyer said in an interview with The McGill Tribune.
“So, two editors here thought that Maclean's should do a university ranking, and it was a big hit. We even had to do a second printing. By the second year, they realized [that] there’s a real interest in this. At the same time, the universities had lots of concerns about [the methods involved in] that first ranking.”
While McGill has enjoyed the number one spot in Maclean’s for 15 years in a row, its placement in other rankings is less stable. Since 2012, McGill has fallen by an average of two spots per year in the Quacquarelli Symonds (QS) World University Rankings, the world’s most popular international university ranking.
Both national and international rankings have received criticism for their opaque methodology: For many students and administrators, it is hard to understand what information goes into a university’s rank. As a result, Adrienne Cara, 2L Law at McGill, doubts whether people care about McGill’s consistent decline.
“People’s perceptions of a school don’t change overnight because rankings come out,” Cara said. “The alumni, research, and people coming from the university are what truly creates a school’s reputation. I went to a lower ranked school for undergrad, but have only encountered alumni with positive stories and employers with positive impressions.”
Deciding what makes a university worthy of a higher ranking is almost entirely subjective. Because different publications have different ideas about what should go into scoring a school, rankings can contradict one another. Consider this year’s rankings: 2019 is the ninth year in a row in which McGill has fallen in the QS rankings, yet it has been climbing steadily in the Times Higher Education (THE) rankings over the past five years. Dwyer acknowledged that designing rankings can be contentious.
“Because [these are] general rankings, we have to look at things that can be applied to all the universities, and it has to be something that can be measured in a uniform way,” Dwyer said. “If you're a small liberal arts university, you're just not going to have the same kind of funding picture or research grants as McGill, or [the University of British Columbia], or [the University of Toronto (U of T)].”
International university rankings are a relatively recent phenomenon, beginning in 2003 with the Academic Ranking of World Universities. According to Alexa analytics, the two most popular university rankings are now the QS and THE rankings. Both systems try to measure two key features of a school: The quality of the education that it offers its students and the amount of research that its faculty produces. However, while these two indicators can be statistically measured using employment and citation data, the rankings may leave out intangible factors that are crucial in determining a student’s quality of life on campus. Dwyer believes that such intangibles can only be measured with a survey.
“There's two things that we do that fall into [elements that can't be measured],” Dwyer said. “Those would be the [reputation] survey [and] the student satisfaction survey. I think those two indicators recognize that, yes, we can crunch all the data [we] want, but statistics can only tell part of the story. [Surveys are] just based on people's opinions. And some people might criticize it for that reason, and others [....] will praise it.”
In the late 2000s, reputational and student surveys were given significant weight in deciding ranks, much to the chagrin of university administrators. Maclean’s was not the only publication subject to a boycott: In a 2007 meeting of the Annapolis Group, a consortium of American liberal arts colleges, a majority of the 80 university presidents in attendance said that they did not intend to participate in the U.S. News rankings in the future. In both Canada and the United States, students and university staff alike were concerned that rankings forced universities to compete with one another instead of working to improve the quality of the education they offered. Madeline Wilson, Students’ Society of McGill University (SSMU) VP University Affairs, agrees with this sentiment.
“Frankly, I couldn't give less of a shit about McGill's international rank,” Wilson wrote in an email to the Tribune. “My aversion to international rankings stems primarily from the fact that [they] explicitly reinforce the idea that academia is rooted in competition. Education shouldn't be a competition.”
The tension between universities and ranking organizations reached a point where the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the Institute for Higher Education Policy (IHEP) both published guidelines for rankers in an effort to protect universities. According to Dwyer, the IHEP pushed ranking publishers to reconsider their methods and motivations.
“I think [the] main point [of the university boycott] was that [universities] wanted to see third-party data and not self-reported data,” Dwyer said. “And, at that time, there were also more and more rankings both nationally and internationally. So, [the IHEP] published the Berlin Principles.”
The Berlin Principles were drawn up in 2006 by the UNESCO European Centre for Higher Education and the United States-based IHEP in an effort to make rankings more objective. While survey data could still be included in the rankings, publications would need to give the results less weight when scoring a university. Administrators favoured this more statistical approach, though it meant that students’ and employers’ opinions would ultimately factor less into rankings: In 1991, 25 per cent of a school’s final Maclean’s score was determined by its reputation; in 2019, reputation was worth only 15 per cent.
However, the drama surrounding rankings does not explain why McGill has been struggling internationally. A student who entered McGill in 2015 seeking a four-year undergraduate degree was accepted to the 25th-best school in the world but graduated from the 35th-best. Simona Bizzozero, Head of Public Relations at QS, offered two reasons for McGill’s slide.
“Regarding McGill University’s decline in the overall ranking, it’s relatively down to the drop in Faculty Students and Citations per Faculty,” Bizzozero wrote in an email to the Tribune. “McGill’s performance in reputational indicators has been quite stable, while the improvement in internationalization metrics hasn’t been strong enough to regain the lost ground.”
‘Faculty Students’ and ‘Citations per Faculty’ are QS jargon for two metrics: The quality of the education that students receive and the amount of research produced by a school’s faculty. The research metric is calculated by taking the number of citations produced by an institution over five years and dividing it by the number of teaching staff at the school.
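As a purely hypothetical illustration of that formula (the figures below are invented for arithmetic’s sake, not drawn from QS data), a university whose researchers accumulate 150,000 citations over five years and which employs 1,500 teaching staff would score:

\[
\text{Citations per Faculty} = \frac{150\,000\ \text{citations}}{1\,500\ \text{faculty}} = 100
\]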
Since 2017, McGill has produced fewer citations than other universities in two popular journals, PLOS ONE and Scientific Reports, likely impacting the school’s ranking. For many students, however, the amount of research their university produces matters less than the quality of the education they receive. McGill is struggling on that front, too.
QS, THE, and Maclean’s all use student-faculty ratios as indicators of a school’s commitment to teaching; it is commonly believed that the fewer students per faculty member, the higher the quality of teaching those students will receive. However, since 2008, McGill’s student population has steadily grown by around 600 students per year, while the number of tenured faculty has remained flat.
While this may not seem like a significant change, the number of students per professor has risen from 16 to 18 in just 10 years. At a school of McGill’s size, that shift has had a massive impact on the student experience: In Fall 2008, 635 students were registered in Physics 101; ten years later, in Fall 2018, that number had risen to 728. Leacock 132 has only 601 seats, meaning that the extra students are forced either to sit on the floor or to watch from overflow lecture rooms where the class is broadcast.
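To see how a flat faculty count pushes that ratio up, consider a rough, purely illustrative calculation (the headcounts here are hypothetical, chosen only to be consistent with the figures above of roughly 600 extra students per year and a ratio moving from 16 to 18): a school with 3,000 teaching staff and 48,000 students starts at a ratio of 16, and after a decade of adding 600 students per year with no new hires, the ratio reaches 18.

\[
\frac{48\,000}{3\,000} = 16, \qquad \frac{48\,000 + 10 \times 600}{3\,000} = \frac{54\,000}{3\,000} = 18
\]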
However, James Martin, Associate Director of Internal Communications at McGill, notes that the university remains proud of its relatively high rank, even factoring in recent declines.
“External rankings can help us promote our strengths and identify the potential areas we would want to improve,” Martin said in an email to the Tribune. “While positive rankings can contribute to a good international profile, McGill's international reputation is based on much more than that [....] employers across Canada and around the world look extremely favourably on McGill graduates when hiring. This is important for young graduates starting out in their careers, as well as those seeking to advance their careers in the future.”
Still, McGill’s celebration of rankings leaves some students feeling cheated. Claire Grenier, U2 Arts, argues that McGill’s fall in rank reflects a growing lack of concern for student well-being.
“I think McGill has fallen because they are not as adept at adjusting to student needs,” Grenier said. “McGill uses rankings to attract students but rarely keeps their promises after students pass through the gates.”
Wilson shares a similar attitude toward university rankings in general.
“The fact that donors, prospective students, and academics use these rankings to determine where they or their money will be going is particularly troubling, because it is in fact those institutions that don't have the resources or capacity to compete for a higher ranking that likely need the most support,” Wilson said.
Rankings exist to make complicated decisions simple—perhaps too simple. No matter how much information goes into deciding which university comes out on top, reducing the quality of an institution to a single score strips away much of the reality of the student experience there. Fortunately, at least one ranking-maker seems to acknowledge this.
“If you're looking at the rankings, it's important to not just look at the main chart and see what three schools are at the top,” Dwyer said. “That’s not going to tell you much.”