
New Times for University Rankings?

University rankings have become a widespread phenomenon over the past decade. Even though they are criticised for their lack of transparency and methodological bias, these rankings have rapidly grown in influence. An independent audit of university rankings, issued a few days ago, culminates several years of work towards improving the reliability and accountability of this sector. Whether this marks new times for university rankings will depend on how the audits themselves are performed and how they are ultimately perceived by the users of the rankings.

The publication in 2003 of the “Academic Ranking of World Universities” by Shanghai Jiao Tong University set off a true wave of discussion about the quality of higher education worldwide. Since then, the number of university rankings has grown rapidly. Today the best-known rankings are produced by a series of private or semi-public organizations such as the Shanghai Ranking Consultancy (SRC), Times Higher Education-Thomson Reuters, Quacquarelli Symonds (QS), CWTS Leiden, the Taiwan Higher Education Accreditation and Evaluation Council, the Centre for Higher Education Development/Die Zeit, and Reitor, among others. The European Commission and the OECD, too, have launched their own ranking projects: the European Multidimensional University Ranking System (U-Multirank) and the Assessment of Higher Education Learning Outcomes (AHELO) project, respectively.

University rankings measure very different things. Some are league tables of a specific set of top universities, combining several indicators into a single index. Others measure only research performance, while still others use a more varied set of indicators to classify different dimensions of performance without producing a single ranking. Some rankings measure the visibility of universities on the web; others measure peer reputation.
Something similar is found in the rankings of business schools: some compute composite indicators for the entire school (like the Financial Times or Princeton Review rankings), whereas others rank specific MBA and executive education programs (like the QS or The Economist rankings).
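To make the idea of a composite index concrete, here is a minimal Python sketch of how such a league table might be computed: each indicator is min-max normalized and the results are combined into a weighted sum. The university names, indicators, weights and figures are purely illustrative assumptions, not those of any actual ranking.

```python
# Hypothetical sketch of a composite ranking index:
# min-max normalize each indicator, then take a weighted sum.
# All names, weights and figures below are invented for illustration.

indicators = {
    "Alpha University": {"publications": 1200, "citations": 54000, "reputation": 78},
    "Beta University":  {"publications": 450,  "citations": 9000,  "reputation": 85},
    "Gamma University": {"publications": 900,  "citations": 30000, "reputation": 60},
}
weights = {"publications": 0.3, "citations": 0.5, "reputation": 0.2}

def min_max_normalize(values):
    """Rescale a list of raw scores to the range [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

names = list(indicators)
composite = {name: 0.0 for name in names}
for ind, w in weights.items():
    normalized = min_max_normalize([indicators[n][ind] for n in names])
    for name, score in zip(names, normalized):
        composite[name] += w * score

# League-table ordering: highest composite score first.
ordered = sorted(composite.items(), key=lambda kv: kv[1], reverse=True)
for rank, (name, score) in enumerate(ordered, start=1):
    print(f"{rank}. {name}: {score:.3f}")
```

Even in this toy example, the final ordering depends heavily on the chosen weights and on the normalization method, which is one reason critics question what a single-number ranking actually means.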

Almost since the creation of these rankings there has been growing criticism of several important aspects. In its first Report on Global University Rankings and their Impact (2011), the European University Association pointed out that the rankings tend to be elitist, focusing only on the research-heavy, oldest and largest universities worldwide.

More problematic, perhaps, is the growing view that the methodologies of these rankings are not transparent (methodologies and data are not, or not sufficiently, available) and rely on disputable data (e.g. self-reported data, or data that has not been normalized). Sometimes the methodologies have changed rapidly, rendering the results of the same ranking virtually incomparable over time.
Growing concern about these matters was behind the launch in 2006 of the Berlin Principles on Ranking of Higher Education Institutions by the International Ranking Expert Group (IREG). IREG was created at the initiative of the UNESCO European Centre for Higher Education (UNESCO-CEPES) and the Institute for Higher Education Policy in Washington, DC.

The Berlin Principles are the most serious attempt to date to set minimum standards for the elaboration of university rankings. They state, among other things, that university rankings must “be transparent regarding the methodology used for creating the rankings”, “use audited and verifiable data whenever possible”, and “include data that are collected with proper procedures for scientific data collection”.

After several years of work on the procedure, the first audits were published a few days ago, at the IREG conference in Warsaw in May 2013. These audits entitle the audited rankings to use the label “IREG approved” as a sign of quality control.

In the current jungle of university rankings, efforts to establish a set of basic principles and independent scrutiny are highly welcome. This is needed not only to set clear standards for the production of the rankings, but also to put a halt to their uncritical and simplistic use.

The extent to which the IREG label gains momentum will depend on two interrelated things: first, on IREG’s ability to deliver consistent and trustworthy work, guaranteeing the independence and quality of its label; and second, on how the label is received by the ranking organizations themselves, and ultimately by the users of the rankings (students, parents, higher education leaders and politicians). In any case, it is important to keep in mind that each university is a unique site of learning embedded in its particular historical, cultural and socio-economic context, and therefore much more than just a number in a ranking list.

Photo credit: http://www.flickr.com/photos/pagedooley/2933664439/


2 thoughts on “New Times for University Rankings?”

  1. One issue that arises in ranking research universities is the choice of database used for publication counts, citation counts, and productivity measures. The Web of Science (SCI and SSCI) covers only the most elite journals, so the more elite universities will always come out much stronger in an assessment drawing on the Web of Science; the most elite universities are thus disproportionately represented in WoS. Reputation-based universities, where researchers may publish in non-indexed journals, will lose out in the rankings simply because WoS does not index everything; in fact, it appears to index less than 50 percent of all scientific publications. This and several other features make it more likely that elite universities will appear high in any ranking.

    • Thanks for the comment. Agreed! The publication data sources used by most university rankings are limited and biased towards specific abstracting and citation indexes.
