U-Multirank: SIRIS Academic first impressions

19 May 2014

After the contentious launch of the European Commission-backed U-Multirank, press and stakeholders have voiced both positive and negative comments on the system. We have decided to put forward our preliminary views on U-Multirank in this post.

Last Tuesday, 13 May, the European Commission officially launched U-Multirank, an international higher education ranking that allows “users to compare the performance of institutions” on the basis of 30 indicators “across five dimensions of higher education and research activities: a) teaching and learning, b) research, c) knowledge transfer, d) international orientation, and e) regional engagement”[1], using five levels of marks, from A (very good) to E (weak).

The ranking’s interface enables the user to assess institutions as a whole or in a specific field, based on the indicators of the five performance dimensions. Thus, the system “does not produce a league table of institutions”[2], but endeavours to reflect “both the diversity of higher education institutions and the variety of dimensions of university excellence in an international context”[3]. This is driven by the project leaders’ view that current rankings fail to reflect the different roles that higher education and research institutions play for different groups of stakeholders, and by their related contention that a sound conceptual framework for assessing institutional performance is needed[4].

Neither the media nor stakeholders remained indifferent to the announcement of the €2 million EU-funded ranking, a contentious issue that by no means started last Tuesday. In fact, almost since the project’s inception, the ranking has been called both unnecessary and poorly designed by various public agents. For instance, in February 2013, as Times Higher Education points out in an article released this week[5], the League of European Research Universities, an association of 21 research-intensive universities, “voiced serious concerns about the project and withdrew its support”.

On the other hand, some enthusiastic voices have been raised as well. For instance, Bernard Rentier, rector of the Belgian University of Liège, says the indicators used in the ranking are “relevant and enlightening”[6], while the European Students’ Union vice-chairman, Fernando Miguel Galán, highlights the “fairness” of the new ranking towards institutions that current rankings have so far disregarded.

The SIRIS Academic team has also received the ranking’s launch with great expectations, as our missions regularly require us to assess institutions, with or without the lens of the existing rankings. On this matter, see our past posts (http://www.sirislab.com/category/higher-education/) as well as our tools for visualising the current major rankings (http://www.sirislab.com/tools/). So, here are our impressions of U-Multirank:

1. Prospective students’ decision-making: complexity vs. simplicity

At first glance, one notices that U-Multirank’s interface, though nicely designed, is actually very complex. We live in a context where prospective students are increasingly influenced by university league tables, as The Guardian reports[7]. Therefore, it appears that straightforward rankings such as Times Higher Education, QS, or ARWU, despite the indicators that some public voices deem inadequate, are not going to be challenged by U-Multirank in terms of prospective students’ decisions.

Nonetheless, the idea of providing students with tools that enable them to tailor their own rankings, such as the Spanish U-Ranking, is very attractive and in line with the philosophy behind the European Commission’s approach to higher education[8].
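To make the idea concrete, here is a minimal sketch of how such a user-tailored ranking could be computed. The dimension names mirror U-Multirank’s five dimensions, but the institutions, scores, weights and function names are invented for illustration; this is a sketch of the general approach, not the actual U-Multirank or U-Ranking methodology.

```python
# Minimal sketch of a user-tailored ranking (illustrative only).
# Dimension names mirror U-Multirank's five dimensions; institutions,
# scores and weights are invented, not real U-Multirank data.

DIMENSIONS = [
    "teaching", "research", "knowledge_transfer",
    "international_orientation", "regional_engagement",
]

# Hypothetical per-dimension scores on a 0-100 scale.
institutions = {
    "University A": {"teaching": 80, "research": 95, "knowledge_transfer": 60,
                     "international_orientation": 90, "regional_engagement": 40},
    "University B": {"teaching": 70, "research": 55, "knowledge_transfer": 85,
                     "international_orientation": 50, "regional_engagement": 95},
    "University C": {"teaching": 90, "research": 40, "knowledge_transfer": 70,
                     "international_orientation": 30, "regional_engagement": 85},
}

def tailored_ranking(institutions, weights):
    """Order institutions by the weighted sum of the dimensions the
    user cares about; dimensions with no weight are simply ignored."""
    def score(scores):
        return sum(weights.get(dim, 0.0) * scores[dim] for dim in DIMENSIONS)
    return sorted(institutions, key=lambda name: score(institutions[name]),
                  reverse=True)

# A student who cares mostly about teaching and regional engagement
# gets a different "top" institution than a research-focused one.
print(tailored_ranking(institutions, {"teaching": 0.6, "regional_engagement": 0.4}))
print(tailored_ranking(institutions, {"research": 1.0}))
```

Even in this toy example the point of the design is visible: two students with different priorities obtain two different orderings, so no single league table is implied.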

2. Users still think in league tables

Human beings have a constant craving for hierarchies and ordinal rankings. We love to know who comes first, even when we know perfectly well that this does not make much sense. So it would be unfair to ask U-Multirank alone to cure the species’ obsession: it clearly has another purpose, and this is very welcome. However, this is one of the reasons why the results might be misleading. For instance, the sample image below[9] shows a comparison table of all the institutions surveyed by the project at bachelor level of study. Intuitively, our eyes immediately settle on the list on the left, where the “top scores” box is highlighted. ETH Federal Inst. of Tech. Zurich becomes, in our minds, the “best university in the world” at bachelor level, though this is by no means the message U-Multirank wants to convey.
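A small sketch can illustrate the pull towards a league table. Assuming, purely for illustration, invented A to E grades for a few institutions across the five dimensions, sorting by the number of top scores, which is exactly what the highlighted “top scores” box invites the eye to do, immediately recreates an ordinal ranking:

```python
# Illustrative only: invented A-E grades across five dimensions.
# Sorting by the count of "A" grades turns multidimensional results
# back into the league table the system set out to avoid.

grades = {
    "ETH Zurich":   ["A", "A", "A", "A", "B"],
    "University X": ["A", "B", "A", "C", "A"],
    "University Y": ["B", "B", "A", "B", "B"],
}

by_top_scores = sorted(grades, key=lambda name: grades[name].count("A"),
                       reverse=True)
for position, name in enumerate(by_top_scores, start=1):
    print(position, name, grades[name].count("A"), "top scores")
```

However carefully the grades are presented, this count-and-sort reading is the one most users will instinctively perform.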

3. Data against the ropes

In the rankings context, data sources and indicators are a byword for controversy. U-Multirank uses some of the same data and data sources as the traditional rankings do, combining them with “some new innovative ones, such as interdisciplinary programs, art-based research outputs, and regional engagement”.

On the other hand, the social sciences and the humanities are again underrepresented, despite this being an open front for the U-Multirank team[10]. For instance, the citation rate indicator still relies on data from databases that do not cover SSH disciplines[11].

Another aspect that makes us doubt the data used in U-Multirank comes from SIRIS Academic’s first-hand experience with the French higher education landscape. “ParisTech” is ranked amongst French institutions (www.paristech.fr) even though it is not an institution, but an association of 12 engineering schools. This mistake must be contextualised within the complexity of the present French HE situation. Nonetheless, ranking ParisTech is rather misleading for a student in search of accurate information.

Other mistakes are much less understandable, for example institutions listed under inaccurate names, such as “ENS Mines Nantes”. Although the full name of that school is École Nationale Supérieure des Mines de Nantes, in France the acronym “ENS” immediately points towards the École Normale Supérieure; the official short name is “Mines Nantes”. Or the fact that, according to the ranking, the top-scoring French institution for a doctorate in the general area of social sciences, business and law is currently Telecom ParisTech, an engineering school.

On the other hand, and without looking for a rank, excellent places for doctoral studies in France certainly include, at the moment, Université Pierre et Marie Curie, Université Paris-Diderot, ENS Ulm and Université Paris-Sud. The absence of all of these institutions from the ranking makes for an extremely misleading result for any student currently trying to find the best place for a doctorate. Such a gap makes one wonder whether the product was not launched a little too soon.

Conclusion

Reviewing recent reactions to U-Multirank, we can say that its launch has been surrounded by contention. Leaving aside political controversies, the main problematic issues of the system relate to its data and data sources. Elizabeth Redden, in an article for Inside Higher Ed, has pointed out that the European University Association released a report on rankings indicating the “questionable feasibility” of some indicators used during the pilot phase of U-Multirank. In our opinion, that assessment appears to be accurate.

However, U-Multirank is a positive step forward in the field of university rankings, as it tries to overcome the limitations of conventional rankings. For instance, the idea of assessing universities as regional revitalisers is highly important, as not every institution, given its context, can be a global research-centred university. This sets benchmarks that are very much in line with the European Commission’s S3 Platform, which seeks to assist “EU countries and regions to develop, implement and review their Research and Innovation Strategies for Smart Specialisation”[12]. Indeed, the role of universities in regional economies was recently the centre of a conference convened by the S3 Platform. Moreover, the idea of implementing a user-centred interface that allows comparative exploration using multiple criteria is good and well-intentioned.

However, the U-Multirank project has so far failed in the implementation, relying on biased, partial, conventional and sometimes incorrect data.

Notes:

[1] Van Vught, F. and Ziegele, F. (2011), Design and Testing the Feasibility of a Multidimensional Global University Ranking, Consortium for Higher Education and Research Performance Assessment, p. 18.

[2] Grove, J. (2014), U-Multirank launched by EU commissioner, Times Higher Education, 13 May. Available from: http://www.timeshighereducation.co.uk/news/u-multirank-launched-by-eu-commissioner/2013272.article. [Accessed: 15 May 2014].

[3] U-Multirank website. Section: About. Available from: http://www.u-multirank.eu/#!/about?trackType=home&sightMode=undefined.

[4] Federkeil et al. (2012), Background and Design, in: Multidimensional Ranking: the Design and Development of U-Multirank, Springer.

[5] Grove, J. (2014), U-Multirank launched by EU commissioner, Times Higher Education, 13 May. Available from: http://www.timeshighereducation.co.uk/news/u-multirank-launched-by-eu-commissioner/2013272.article. [Accessed: 15 May 2014].

[6] Rabesandratana, T. (2014), European Commission Unveils “Fairer” University Ranking System, Science Insider, 14 May. Available from: http://news.sciencemag.org/education/2014/05/european-commission-unveils-fairer-university-ranking-system. [Accessed: 16 May 2014].

[7] Adams, R. (2013), Prospective university students “swayed by league tables”, The Guardian, 5 April. Available from: http://www.theguardian.com/education/2013/apr/05/university-league-tables. [Accessed: 16 May 2014].

[8] High Level Group on the Modernisation of Higher Education (2013), Report to the European Commission on Improving the quality of teaching and learning in Europe’s higher education institutions, European Commission, p. 14.

[9] Sample image retrieved from: http://www.u-multirank.eu/#!/forstudents/?simpleMapping=true&trackType=student&section=ranking&instutionalField=true&pref-3=1&pref-4=1&pref-5=1&pref-11=1&pref-13=1&pref-15=1&sortOrder=desc&sortCol=overallPerformance.

[10] Van Vught, F. and Ziegele, F. (2011), Design and Testing the Feasibility of a Multidimensional Global University Ranking, Consortium for Higher Education and Research Performance Assessment, p. 63.

[11] Hazelkorn, E. (2013), Europe Enters the College Rankings Game, Washington Monthly, September/October. Available from: http://www.washingtonmonthly.com/magazine/september_october_2013/features/europe_enters_the_college_rank046894.php?page=all. [Accessed: 17 May 2014].

[12] European Commission. Smart Specialisation Platform Section. Available from: http://s3platform.jrc.ec.europa.eu/activities.