Tag: rankings

U-multirank is growing

Earlier this month, the third U-Multirank was launched. The 2016/2017 edition is larger than ever, covering 1,300 institutions from 90 countries, 13 subject areas and over 10,700 study programmes. The idea of U-Multirank is to provide a multidimensional ranking, where it is possible to examine universities according to their results in the following areas: research performance, teaching and learning, knowledge transfer, international orientation and regional engagement. In total, 31 performance indicators are covered.

The main results show that there are different dynamics on different indicators. While in research performance/reputation the “usual suspects” that dominate many of the one-dimensional rankings are also at the top, the picture for the teaching and learning indicators is more complex. For teaching and learning, U-Multirank includes a survey of 105,000 students worldwide at participating institutions. The 20 universities that obtained the highest satisfaction levels come from 9 different countries. In the press release, Dr. Frans van Vught and Dr. Frank Ziegele, who lead the project, comment on this: “The opinions of current students are – and should be – influential in helping tomorrow’s students decide where to study. Students want to find the university that’s best for them, according to their own preferences, and often look to their peers to learn from their experiences, especially in an area like learning and teaching where no one knows better than the students themselves.”

U-Multirank 2015 published this week

On Monday, the 2015 edition of U-Multirank was published, following the first edition released in May 2014. While the 2014 edition featured 850 institutions, the 2015 version has expanded and now includes over 1,200 institutions in 89 countries worldwide. The press release highlighted that with 21,000 data scores on the institutional level and 37,000 on the field level, U-Multirank is now the largest ranking of higher education institutions in the world.

Data is collected on 31 different indicators, and in addition to the option to compare institutions with other similar institutions, 17 ready-made rankings are also provided. Some of these (research and research linkages) also feature the “usual suspects” at the top, in particular when examining publications and citation rates.

However, some of the other indicators (e.g. economic involvement/regional dimension, or international orientation) show a very different set of top universities, and the usual US (and to some extent UK) dominance is challenged. For instance, the University of Bergen is listed as number two in the international orientation ready-made ranking, right behind the Asian Institute of Technology in Thailand. In addition, there are also subject-based rankings in various fields; in medical education, for instance, Bern University was ranked at the top. In general, most of the ready-made rankings show a rather diverse picture of institutions and countries.

News: University rankings as institutional strategy tools?

Last week, EUA published a new report on rankings, ‘Rankings in Institutional Strategies and Processes: Impact or Illusion?’ (RISP), in which the project examines in detail how rankings are used for institutional development across Europe. This report directly follows up on two earlier EUA reports on rankings whose primary focus was analyzing the methodology of rankings. Earlier this year, NIFU also published a report on the Nordic countries, focused on a comprehensive deconstruction of the rankings to identify what assures success, and on examining the impact of rankings on the leadership of research-intensive universities in the Nordic region.

Data for the EUA report was gathered in various forms. An online survey was sent out to all EUA members (about 850). The survey yielded responses from 171 institutions in 39 countries, with broad coverage of various European countries; 90% of the respondents came from institutions that are part of a ranking. Following up on the survey, a total of 48 meetings were conducted through six site visits to understand in more detail how institutions work with rankings, and a roundtable was organised with 25 participants from 18 European countries to create an arena for peer learning and sharing of experiences.

The main conclusion from this project is that rankings do indeed have an effect on institutional behaviour, but that this effect varies. 60% of the survey respondents replied that rankings are used in their institutional strategies – but the specific kind of use varied from examining certain indicators to using them in a comprehensive manner. Furthermore, it is highlighted that as many as 39% report using the results of rankings “to inform strategic, organisational, managerial or academic actions”, and another third of respondents were planning to do so. Unsurprisingly, rankings were widely used in marketing, but respondents also reported use in “the revision of university policies, the prioritisation of some research areas, recruitment criteria, resource allocation, revision of formal procedures, and the creation of departments or programmes”.

News: U-Multirank launched yesterday

Yesterday, the 13th of May, U-Multirank finally reached its launch date and the data was presented to the public. The project has been funded by the European Union and has cost about 2 million euros. In this first edition, the ranking includes 850 institutions that can be compared in a multidimensional manner. The key argument has been to provide a system where institutional scores are not aggregated into one single score, but instead give users a means to rank institutions according to selected criteria.

In the press release, Androulla Vassiliou, European Commissioner for Education, Culture, Multilingualism and Youth, commented on the launch: “I welcome the launch of this exciting new development in higher education. U-Multirank will enable students to make more informed decisions about where to study and give us a more accurate picture of how universities perform. We are proud of our world-class higher education, but we need many kinds of universities, catering for a wide range of needs; that means strong technical and regional universities just as much as outstanding research universities. U-Multirank highlights many excellent performers that do not show up in current, research-focused, global rankings – including more than 300 universities that have never appeared in any world ranking until now.”

The core idea of U-Multirank is its focus on five key areas: research performance, teaching and learning quality, international orientation, knowledge transfer and regional involvement. Furthermore, the results can be viewed according to academic fields; the 2014 edition includes business studies, electrical engineering, mechanical engineering and physics. In 2015, psychology, computer science and medicine will be added to the list.

The press release highlighted that 95% of institutions receive an A score on at least one measure, showing that almost all participating institutions have certain strengths in their institutional profile and that these strengths can vary. At the same time, only 12% of the institutions receive more than 10 top scores, indicating also some vertical differentiation of institutions. Furthermore, the press release indicated that the participants come primarily from Europe: 62% are from Europe, 17% from North America, 14% from Asia and 7% from Oceania, Latin America and Africa.

Ranking and mapping – U-Multirank coming up

Have you heard about U-Multirank? It is a project to develop a multidimensional ranking tool where users can decide which indicator they want to rank the institutions by.

This means that no composite scores will be presented that would allow the creation of a single top 10 or top 100 ranking of the best institutions. The main idea is that one can compare institutions with similar profiles to each other rather than create a single list, the argument being that a single composite score does not sufficiently take into account the varied profiles of institutions.
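The difference between a user-selected indicator ranking and a conventional composite league table can be sketched in a few lines. The institutions, indicators and scores below are entirely invented for illustration – this is not U-Multirank's data or methodology, only the general idea that the ordering changes with the chosen indicator while a composite score freezes one ordering.

```python
# Hypothetical data: three made-up institutions scored on three indicators.
institutions = {
    "Alpha U": {"research": 95, "teaching": 60, "regional_engagement": 40},
    "Beta U":  {"research": 55, "teaching": 90, "regional_engagement": 85},
    "Gamma U": {"research": 70, "teaching": 75, "regional_engagement": 70},
}

def rank_by(indicator):
    """Order institutions by a single user-chosen indicator, best first."""
    return sorted(institutions,
                  key=lambda name: institutions[name][indicator],
                  reverse=True)

def composite_rank(weights):
    """A conventional league table: one weighted score per institution."""
    def score(name):
        return sum(w * institutions[name][ind] for ind, w in weights.items())
    return sorted(institutions, key=score, reverse=True)

print(rank_by("research"))             # research-led ordering
print(rank_by("regional_engagement"))  # a different institution tops the list
print(composite_rank({"research": 0.6, "teaching": 0.3,
                      "regional_engagement": 0.1}))
```

With a research-heavy weighting the composite table mirrors the research ranking, and the strength of "Beta U" in regional engagement disappears from view – which is exactly the kind of information a per-indicator comparison keeps visible.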

The launch of U-Multirank has been anticipated for some time now, and while the exact date has not yet been announced, U-Multirank is expected to become available in mid-May 2014. The launch will be marked by a press conference in Brussels, with the European Commissioner for Education, Culture, Multilingualism and Youth introducing the instrument and the researchers on the team providing their insights about the project – we will keep you posted about developments!

U-Multirank also draws on an accompanying project, U-Map, an instrument that has been available since August last year. U-Map has been in development since 2004 and is now fully functional. It is a mapping tool for higher education institutions in Europe and beyond (listen also to our earlier Hedda podcast about U-Map, where we talked to Franz Keiser and Elisabeth Epping).

Hedda podcast: Student engagement with knowledge as a means to define quality

Episode 44 of our podcast series features Dr. Paul Ashwin from Lancaster University in the UK. In the podcast we talk about student engagement with knowledge as a key feature of quality in higher education, and he reflects on some of the key results from a three year long study on pedagogical quality and inequality in the UK.


Click here to download the Policy makers guide (pdf) that the research team has prepared based on the project results. 

View also the publications that the podcast is referring to:

Dr. Paul Ashwin (Lancaster University)

Dr. Paul Ashwin is a Senior Lecturer and Head of Department at the Department of Educational Research at Lancaster University in the UK. He previously worked at the Institute for the Advancement of University Learning, University of Oxford, and at Newham College of Further Education. His key research interests concern the relations between teaching-learning and knowledge-curriculum practices in higher education, as well as the implications of these for both policy and practice. He also has a keen interest in the methodological development of higher education studies in this area.

Times Higher Education ranking tool to compare US universities

By now most of us have been flooded with different kinds of rankings. THE, Shanghai and QS, with their composite scores that rely rather heavily on research, have been accompanied by a multitude of rankings on all possible and impossible indicators – from university ICT visibility, employability and reputation to the best place to party.

THE World University Rankings has now also compiled a comparison of US universities in terms of average SAT score, acceptance rate and total number of students enrolled. On the THE website, one can also find a version of the tool with out-of-state tuition fees included. Aligning these results with the annual THE rankings, some interesting cases emerge.

The institutional profiles of the top universities clearly differ. Institutions such as UCLA and UC Berkeley have much more open acceptance rates than other top-ranked US universities: while Harvard has an acceptance rate of 6%, Berkeley accepts 22% and UCLA 27% of applicants, indicating that not all of the top institutions are equally selective. The institutions also vary greatly in size, but it is not the case that the most selective institutions are the smallest; the emerging picture is rather varied. Harvard in fact has over 27,000 students, whereas Caltech has only just over 2,200 students and an acceptance rate of 13%. The highest acceptance rate amongst the top universities is at the University of Washington, which is in 25th place in the THE ranking but admits a full 58% of applicants.

The very largest institutions, the University of Phoenix, Ashford and Arizona State University, are also amongst those that either are not ranked or do not perform particularly well in the composite ranking. Of the five largest, only Arizona State University is part of the THE ranking, in 146th position. While on a global scale this is a good position, the students it attracts in the US clearly have much lower average SAT scores than those at better-ranked institutions, and its acceptance rate is 89%. At the other end one can find Georgetown, a rather selective institution that accepts 18% of applicants but is well behind other equally selective institutions in the rankings, being in fact ranked below Arizona State, at #160.

News: World University Rankings 2012-2013 published last week

Last week the most recent set of World University Rankings was published. The top 10 includes Caltech, Oxford, Stanford, Harvard, MIT, Princeton, Cambridge, Imperial College London, UC Berkeley, and Chicago – in essence the same list as last year, with just Oxford and Stanford swapping places. The first non-US/non-UK institution was ETH Zürich in 12th place.

The best Asian university was the University of Tokyo in 27th place, and THE editor Phil Baty featured in his analysis Alan Ruby, who argued that there is a general rise of Asian universities in the list, likely linked to austerity measures in Western universities and the focus on excellence in a number of Asian countries, which is now paying off. However, another analysis indicates that good or improved positioning in rankings is not indicative of increasing quality across Asia – for instance, in the case of India there is a clear differentiation among institutions, and the few highly selective institutions provide few spillovers to the system as a whole.

The best Nordic university is Karolinska in 42nd place. In Norway, nationwide media wrote about the University of Oslo rising some 17 places, with the rector commenting that this rise is due to long-term efforts to raise research quality. Odd words after last year's “dramatic fall” – which was just as many places down. This suggests that over a two-year perspective the position is about the same, yet in those two years it has generated two kinds of news: the dedication to research and results on the one hand, and the dramatic fall on the other. One can of course question how much has really changed in two years. But one could argue that the University of Oslo's concern about falling out of the top 200 is quite grounded in public perception, considering that the group below 200 in the THE analysis is called “the best of the rest” or “they might be giants…or were”.

New QS rankings by subject published last week

Last week, the third edition of the QS rankings by subject was launched. Altogether, 2,858 universities were evaluated to create top 200 lists in 30 subject areas. The indicators relate to the academic community, employer feedback and citations, as a means of representing research intensity and quality, and different subject areas weight the indicators differently due to the varied internal structure of the disciplines.
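The effect of subject-specific weighting can be illustrated with a small sketch. The subjects, indicator names, weights and scores below are invented for illustration and are not QS's actual methodology; the point is simply that identical raw indicator scores yield different subject scores once each subject applies its own weights.

```python
# Invented raw indicator scores for one hypothetical university.
raw = {"academic_reputation": 80, "employer_reputation": 60, "citations": 90}

# Hypothetical per-subject weights (each set sums to 1.0); a reputation-heavy
# humanities subject vs. a more citation-driven applied subject.
subject_weights = {
    "Philosophy":       {"academic_reputation": 0.7,
                         "employer_reputation": 0.2,
                         "citations": 0.1},
    "Computer Science": {"academic_reputation": 0.4,
                         "employer_reputation": 0.2,
                         "citations": 0.4},
}

def subject_score(scores, subject):
    """Weighted sum of the raw indicators under one subject's weighting."""
    weights = subject_weights[subject]
    return sum(weights[ind] * scores[ind] for ind in weights)

print(subject_score(raw, "Philosophy"))        # reputation dominates
print(subject_score(raw, "Computer Science"))  # citations count for more
```

Since the citations score here is the strongest raw indicator, the citation-heavy weighting produces the higher subject score, even though nothing about the underlying institution changed between the two subjects.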

The rankings include 30 subject areas in five groups:

  • Humanities: Philosophy, Modern Languages, Geography, History, Linguistics, and English Language & Literature
  • Life Sciences & Medicine: Medicine, Biological Sciences, Psychology, Pharmacy & Pharmacology, and Agriculture & Forestry
  • Social Sciences: Statistics & Operational Research, Sociology, Politics & International Studies, Law, Economics & Econometrics, Accounting & Finance, Communication & Media Studies, Education
  • Engineering & Technology: Computer Science & Information Systems, Chemical Engineering, Civil & Structural Engineering, Electrical & Electronic Engineering, Mechanical, Aeronautical & Manufacturing Engineering
  • Natural Sciences: Physics & Astronomy, Mathematics, Environmental Sciences, Earth & Marine Sciences, Chemistry, Materials Sciences

An interesting bit of statistics is provided on the QS site, where they have also published which subject areas are most viewed, perhaps offering a speculative account of, if not the importance of, then at least the focus on rankings in different subject areas. Perhaps unsurprisingly, there is an overrepresentation of the so-called hard sciences, whether pure or applied. However, fields from the social sciences – such as Law, Econometrics and Psychology – also feature in the top 10. All of the fields in the top 10 are known to be highly competitive, and are perhaps also areas with more common standards of quality that are more easily quantifiable. Perhaps more surprisingly, the list ends with Geography, which has been grouped under Humanities.

News: Times reputation rankings published – six emerge clearly on top

Yesterday, the newest Times Reputation Rankings were published. Unlike other rankings, including the THE World University Rankings, this list is compiled purely on the basis of reputation – that is, the subjective perception of renowned academics in six subject areas: engineering and technology; physical sciences; life sciences; clinical, preclinical and health; social sciences; and arts and humanities.

The results perhaps speak for themselves. The top six universities lead with a huge margin: Harvard University (USA), Massachusetts Institute of Technology (USA), University of Cambridge (UK), University of Oxford (UK), University of California, Berkeley (USA) and Stanford (USA). While these institutions are also frequently part of the top 10 in rankings based on other measures, their dominance in the reputation rankings is massive. In essence, this is also an exercise in brand creation – no other indicator or ranking places these six universities ahead by such a margin, yet somehow they have built a reputation that clearly distinguishes them from the rest.

The THE article reviewing the latest reputation ranking results called these universities “super-brands”. One could also wonder what impact the rankings themselves have had on this over the years, further institutionalising the position of the top institutions as the main players in the global higher education marketplace. Can the top be competed with at all, or is the Matthew effect just going to snowball over time?

What does Stanford have that Princeton does not? While Princeton by no means does badly in the reputation rankings, it clearly lags behind the first six, despite in fact being ahead of Cambridge in the latest World Rankings. Nevertheless, in terms of reputation it received barely half the score of sixth-placed Stanford. Perhaps something for the marketing department to ponder?