The newest Academic Ranking of World Universities (ARWU) was published yesterday by the Center for World-Class Universities at Shanghai Jiao Tong University. Also known as the Shanghai ranking, it is one of the best-known and most widely used rankings in the world, having now been compiled for nine consecutive years. The very success of the ranking may also explain why the page publishing it was down for most of the day yesterday, likely due to usage overload.
As expected, there are no major differences at the top of the list compared to last year. The top ten consists of the usual suspects, with small fluctuations either way. Harvard is yet again on top, as it has been throughout the years, and leads by a clear margin. If anything, during the nine years this ranking has been compiled, the gap between the first and tenth university has marginally widened: in 2003 the tenth place had a score of 59.1, which had decreased to 56.4 by 2011. Overall, during the nine years the institutions in the top 10 have remained almost the same, aside from Yale, which was in 8th place and subsequently dropped out of the top ten. One institution that appears to be improving over the years is MIT; others show stability, and some (e.g. Cambridge) seem to be moving back and forth somewhere in the top 5. (Click on the image to see an overview of the top 10 institutions from 2003 to 2011.)
The relative stability of the list is also marked by the fact that there are only ten new entries in the top 500 and three new entries in the top 100. However, the press release does highlight the progress made by universities from the Middle East, as well as the rising standing of China, which now has 35 universities in the top 500.
The Shanghai ranking is often criticized for its emphasis on research quality and its inability to capture all aspects of quality. However, it has been quite clear from the beginning that this ranking focuses on research quality, as the indicators used include: the number of alumni and staff winning Nobel Prizes and Fields Medals, the number of highly cited researchers selected by Thomson Scientific, the number of articles published in the journals Nature and Science, the number of articles indexed in the Science Citation Index – Expanded and the Social Sciences Citation Index, and per capita performance relative to the size of an institution. It has nevertheless become one of the most cited ranking systems, in most cases also making it into mainstream media – referred to either as a success story for local universities or as a failure to make it into the top 500.
What sometimes seems to be forgotten is that making it into the top 500 is a privilege reserved for the very few in a diverse higher education landscape. The World Higher Education Database (WHED) has information on 15,000 higher education institutions in 180 countries that offer at least a post-graduate degree, and even this is not a comprehensive number – and the great majority of these are not even evaluated by this ranking (the press release indicates that “over 1000 get ranked and 500 get published in the ranking”). This means that the rankings capture, in essence, only a very small group of institutions that are not representative of what higher education is, or should be.
This leads to the inevitable debate about rankings potentially encouraging unnecessary mimicry of the successes of comprehensive, elite research universities, and a lack of focus on other aspects of universities that are perhaps not at the core of a ranking system but are nevertheless a core function of universities. These criticisms have been met with the development of a multitude of other rankings, each with its own niche and focus (some examples can be found here). To what extent higher education also contains aspects that are ‘unmeasurable’ in a quantifiable manner is of course up for debate.
Nevertheless, improving one's ranking appears to be a goal for many systems, leading to strategic action aimed at improving ranking positions. So – the new table is out, and while one might disagree with the ranking methodology, it is widely argued that “rankings are here to stay”. Now – while they are indeed here to stay, not everyone can be Harvard. Or Stanford.
There has to be space to find a formula for success that purposefully falls outside the top 500 and the counting of Nobel Prize–winning alumni. But how does one go beyond the appeal of putting institutions in boxes where one is deemed better in a seemingly objective way? There is no doubt that success in rankings carries powerful symbolic value. Higher education has always treated reputation as a resource in the marketplace, but if competing for these symbolic resources starts to overshadow and conflict with the core mission(s) of the university, then the slope ahead might be very slippery.