University rankings should come with “health warnings” and clear methodological information, but will continue to grow in influence and reach, despite criticism, says the editor of the Times Higher Education (THE) Rankings.
Phil Baty has moved to defend rankings after James Cook vice-chancellor Sandra Harding publicly declared a boycott of the THE rankings, and Adelaide University vice-chancellor Warren Bebbington argued university rankings are failing student consumers.
University of Southern Queensland vice-chancellor Jan Thomas has argued global rankings “ask only for conformity” when diversity is what’s needed in higher education, and Monash University’s Robert Nelson has bemoaned the influence that research income has upon many ranking indicators.
“No rankings are perfect,” Mr Baty said, but he argued critics needed to be more specific about their concerns.
For example, Mr Baty said Professor Harding’s argument that the rankings failed to properly recognise the strengths of smaller, more specialised universities overlooked the fact that the smaller, more specialised California Institute of Technology ranked world number one in the latest THE ranking.
Mr Baty also said James Cook University had never participated in the THE rankings under THE’s partnership with Thomson Reuters.
He agreed that global rankings all tend to focus more heavily on research, but said THE had worked hard to make rankings more holistic. THE replaced its entire ranking system in 2010 after a thorough review.
“The THE rankings are the only rankings to fully reflect the unique subject mix of each and every institution across the full range of performance indicators and to take proper account of excellence in the arts, humanities and social sciences, so badly neglected by other rankings,” Mr Baty said.
Rankings agencies should all be working to make it clear that no ranking can be objective, Mr Baty said, because each one reflects the subjective view of the compiler about which indicators to use, and what weight to place on each indicator.
“All rankings should come with clear methodological information and relevant health warnings,” he said, so that people understand what the rankings do and do not cover.
But Adelaide University’s Warren Bebbington said most students don’t realise many of the rankings scarcely measure teaching or the campus experience at all.
“University rankings would have to be the worst consumer ratings in the retail market,” Professor Bebbington said last week.
“The international rankings must change, or student consumers worldwide will eventually stop using them.”
Mr Baty said there’s “not a chance” that rankings will be dumped by students, and that the Shanghai rankings were never conceived as a consumer rating.
He said Professor Bebbington’s comments add to the debate by helping students better understand what global rankings actually measure. But students will continue to refer to rankings, he said, because the reputation and prestige they capture are crucial to students’ decisions.
Simon Marginson, professor of higher education at the University of Melbourne, said that while the world would be a better place without rankings, the idea of a boycott was a “head in the sand” approach.
Instead, Professor Marginson advocated improving current rankings and relying more heavily on rankings built on single indicators, rather than those based on surveys or multi-indicator rating systems that can be manipulated.
“You should have systems of improvement where everyone can improve, you shouldn’t be measuring people in general terms all the time.”
Nevertheless, Professor Marginson said rankings will remain popular, partly as a result of the “gold medal syndrome”.
“People love lists and cultural comparisons and in the global space they love comparing countries.”
And for now at least, there’s a general agreement that ranking universities on research is valid, Professor Marginson said.
“Research becomes prestige and prestige becomes the draw card.”