Autumn is the season for university rankings. We’ve already had the Academic Ranking of World Universities, produced in Shanghai, the QS ranking and the Leiden Ranking. Now the Times Higher Education Rankings have got universities in a spin once again.
Does your university go up or down? How is it placed relative to its competitors? Which countries are climbing? Which universities are falling? There are lots of “good stories” for the press and the public every time the latest “definitive” chart is produced.
Before rankings, we had to rely on university leaders to tell us which universities were good and which weren’t. Of course, each claimed their own was the best for a range of different reasons. Rankings appeal because they appear to offer the public a simple, objective measure that can cut through the hype.
The vice-chancellors would argue that it is no simple matter to compare universities with very diverse missions, funding regimes and histories. But if you cannot measure what you ought to measure, you measure what you can measure and then you claim that what you measure is a proxy for what you cannot measure – and hocus-pocus, you have a way to compare all the universities of the world. This is how all the rankers do it, THE included.
The first world ranking, the ARWU, appeared in 2003. It used, and still uses, the number of Nobel Laureates who have worked at each institution as a proxy for the quality of the teaching that goes on there.
Do we really believe that the quality of the teaching is zero at the roughly 16,000 universities of the world that have never produced a Nobel Laureate? This is perhaps one of the more striking examples of how the system of using proxies functions, or rather fails to function, but it is not a unique one.
THE claims to have a more nuanced approach, but I would argue that rankings, as they exist today, do not measure teaching quality, service to society, or activities in the arts, humanities and large parts of the social sciences, including business studies.
What they do is give a good representation of the parameters that have been used to evaluate the quality of research in the natural sciences, medical sciences and related fields for years. That means, if you wish to find out how the scientists themselves would rate and rank research universities primarily specialising in these fields, the ranking of the world universities is a good tool.
However, if the rankers claim that their rankings measure the overall quality of comprehensive universities with a broader mission, I tend to disagree with them. That is especially true if the bulk of the activity of the university is not in science and related fields.
Is it a coincidence that the top 10 is dominated by some of the richest universities in the world, many of which generate a significant part of their income by charging high tuition fees?
Europe is home to a huge number of institutions that charge little to no fees yet make significant contributions to their regions, not least by offering higher education to local students, regardless of whether they can afford $30,000 or £9,000 a year to pay for it. You might argue that spreading knowledge is part of a university’s core mission, but this is not something that is reflected in rankings.
Many universities have excellent records in both arts and science, so it may very well be that the universities that come out at the top of the rankings are indeed at the top in all fields. But by focusing so much on research, and often only some areas of research, rankings have the power to influence the direction of higher education. Universities and even whole countries strive to advance in the rankings and as a result risk focusing only on those areas that are valued by the rankers. How good is a university that has an exceptional record for citations if it has neglected teaching excellence and service to society in its quest to gather them?
If success in the rankings becomes the most important criterion in developing your university, you may end up with universities trying to copy the very best university of the kind that tops the ranking lists at the moment. However, given the economic and demographic realities in most countries of the world, they may never succeed. In the meantime, they may neglect some of the more important missions of a university, such as educating the next generation of citizens.
Rankings are here to stay. They give useful information about universities, but they must be treated with care and must not be viewed as the complete story.
So, congratulations to all of you who do well in the THE ranking. And to those vice-chancellors who don’t find themselves at the helm of a “top university”, do not despair. You may be excellent in ways that are much harder to quantify.