Sunday, November 05, 2017

Ranking debate: What should Malaysia do about the rankings?


A complicated relationship

Malaysia has had a complicated relationship with global university rankings. There was a moment back in 2004 when the first Times Higher Education Supplement-Quacquarelli Symonds (THES-QS) world rankings put the country's flagship, Universiti Malaya (UM), in the top 100. That was the result of an error, one of several QS made in its early days. Over the following years UM went down and up in the rankings, but generally trended upwards, with other Malaysian universities following behind. This year it is 114th in the QS world rankings and the top 100 seems in sight once again.

There has been a lot of debate about the quality of the various ranking systems, but it does seem that UM and some other universities have been steadily improving, especially with regard to research, although, as the recent Universitas 21 report shows, output and quality are still lagging behind the provision of resources.  

There is, however, an unfortunate tendency in many places, including Malaysia, for university rankings to get mixed up with local politics. A good ranking performance is proclaimed a triumph by the government and a poor one is deemed by the opposition to be punishment for failed policies.

QS rankings criticised

Recently Ong Kian Ming, a Malaysian opposition MP, said that it was a mistake for the government to use the QS world rankings as a benchmark to measure the quality of Malaysian universities and that the ranking performance of UM and other universities is not a valid measure of quality.

"Serdang MP Ong Kian Ming today slammed the higher education ministry for using the QS World University Rankings as a benchmark for Malaysian universities.
In a statement today, the DAP leader called the decision “short-sighted” and “faulty”, pointing out that the QS rankings do not put much emphasis on the criteria of research output.

According to the QS World University Rankings  for 2018, released on June 8, five Malaysian varsities were ranked in the top 300, with Universiti Malaya (UM) occupying 114th position."

The article went on to say that:


"However, Ong pointed to the Times Higher Education (THE) World University Rankings for 2018, which he said painted Malaysian universities in a different light.

According to the THE rankings, which were released earlier this week, none of Malaysia’s universities made it into the top 300."



Ong suggests that the ministry should rely instead on locally developed measures.

"Instead of being “obsessed” with the ranking game, he added, the ministry should work to improve the existing academic indicators and measures which have been developed locally by the ministry and the Malaysian Qualifications Agency to assess the quality of local public and private universities."

Multiplication of rankings

It is certainly not a good idea for anyone to rely on any single ranking. There are now over a dozen global rankings and several regional ones that assess universities according to a variety of criteria. Universities in Malaysia and elsewhere could make more use of these rankings, some of which are technically much better than the well-known big three or four: QS, THE, the Shanghai Academic Ranking of World Universities (ARWU) and, sometimes, the US News Best Global Universities.

Dr. Ong is also quite right to point out that the QS rankings have methodological flaws. However, the THE rankings are not really any better, and they are certainly not superior in the measurement of research quality. They also have the distinctive attribute that 11 of their 13 indicators are not presented separately but are bundled into three groups, so that the public cannot, for example, tell whether a good score for research is the result of an increase in research income, more publications, an improvement in reputation for research, or a reduction in the number of faculty.

The important difference between the QS and THE rankings is not that the latter are focused on research: QS's academic survey is specifically about research, and its faculty-student ratio, unlike THE's, includes research-only staff. The salient difference is that the THE academic survey is restricted to published researchers while QS's allows universities to nominate potential respondents, something that gives an advantage to upwardly mobile institutions in Asia and Latin America.


Ranking vulnerabilities
All three of the well-known rankings, THE, QS and ARWU, now have vulnerabilities: metrics that can be influenced by institutions and where a modest investment of resources can produce a disproportionate and implausible rise in the rankings.

In the Shanghai rankings the loss or gain of a single highly cited researcher can move a university up or down dozens of places in the top 500. In addition, the recruitment of scientists whose work is frequently cited, even for adjunct positions, can help universities excel in ARWU’s publications and Nature and Science indicators.

The THE citations indicator has allowed a succession of institutions to over-perform in the world or regional rankings: Alexandria University, Anglia Ruskin University in Cambridge, Moscow Engineering Physics Institute, Federico Santa Maria Technical University in Chile, Middle East Technical University, Tokyo Metropolitan University, Veltech University in India, and Universiti Tunku Abdul Rahman (UTAR) in Malaysia. The indicator officially has a 30% weighting, but in reality its effect is even greater because of THE’s “regional modification”, which gives a boost to every university except those in the top-scoring country. The modification used to apply to all of the citations score but now covers half.
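To illustrate how a country adjustment of this kind inflates citation scores, here is a minimal sketch. The square-root adjustment and the half-and-half blend follow THE's published description of the regional modification, but the exact formula and the numbers below are illustrative assumptions, not THE's actual calculation.

```python
# Illustrative sketch of a "regional modification" for a citations indicator.
# Assumption: the adjusted score divides a university's citation impact by the
# square root of its country's average impact, and (since the change described
# above) only half the indicator uses the adjusted figure.

def regional_modification(university_impact, country_avg_impact, blend=0.5):
    """Blend raw citation impact with a country-adjusted impact."""
    adjusted = university_impact / (country_avg_impact ** 0.5)
    return blend * adjusted + (1 - blend) * university_impact

# The same raw impact (0.6) in a low-scoring country (average 0.4)...
low_country = regional_modification(0.6, country_avg_impact=0.4)
# ...versus in the top-scoring country (average 1.0), where the
# adjustment changes nothing:
top_country = regional_modification(0.6, country_avg_impact=1.0)
print(low_country > top_country)  # the low-country university gets a boost
```

The boost is largest precisely where national citation averages are lowest, which is one reason universities from outside the established research powers can post implausibly high citation scores.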

The vulnerability of the QS rankings is the two survey indicators, accounting for 50% of the total weighting, which allow universities to propose their own respondents. In recent years some Asian and Latin American universities, such as Kyoto University, Nanyang Technological University (NTU), the University of Buenos Aires, the Pontifical Catholic University of Chile and the National University of Colombia, have received scores for research and employer reputation that are out of line with their performance on any other indicator.

QS may have discovered a future high flyer in NTU, but I have my doubts about the Latin American places. It is also most unlikely that Anglia Ruskin, UTAR and Veltech will continue to do so well in the THE rankings if they lose their highly cited researchers.

Consequently, there are limits to the reliability of the popular rankings and none of them should be considered the only sign of excellence. Ong is quite correct to point out the problems of the QS rankings but the other well known ones also have defects.


Beyond the Big Four


Ong points out that if we look at "the big four" then the high position of UM in the QS rankings is anomalous. It is in 114th place in the QS world rankings (24th in the Asian rankings), 351-400 in THE, 356 in the US News global rankings and 401-500 in ARWU.

The situation looks a little different when you consider all of the global rankings. Below is UM's position in a dozen global rankings. The QS world rankings are still where UM does best, but here it sits at the end of a curve rather than standing apart from it. UM is 135th for publications in the Leiden Ranking, generally considered by experts to be the best technically, although it is lower for high-quality publications, 168th in the Scimago Institutions Rankings, which combine research and innovation, and 201-250 in the QS graduate employability rankings.

The worst performance is in the uniRank rankings (formerly 4icu), based on web activity, where UM is 697th.

The Shanghai rankings are probably a better guide to research prowess than either QS or THE, since they deal only with research and, with one important exception, have a generally stable methodology. UM is 402nd overall, having fallen from 353rd in 2015 because of changes in the list of highly cited researchers used by the Shanghai rankers. UM does better for publications: 143rd this year and 142nd in 2015.

QS World University Rankings: 114 [general, mainly research]
CWTS Leiden Ranking: publications 135, top 10% of journals 195 [research]
Scimago Institutions Rankings:  168 [research and innovation]
QS Graduate Employability Rankings: 201-250 [graduate outcomes]
Round University Ranking: 268 [general]
THE World University Rankings: 351-400 [general, mainly research]
US News Best Global Universities: 356 [research]
Shanghai ARWU: 402 [research]
Webometrics: overall 418 (excellence 228) [mainly web activity]
Center for World University Rankings: 539 [general, quality of graduates]
Nature Index: below 500 [high impact research]
uniRank: 697 [web activity]


The QS rankings are not such an outlier. Looking at indicators in other rankings devoted to research gives results that are fairly similar. Malaysian universities would, however, be wise to avoid concentrating on any single ranking, and they should look at the specific indicators that measure the features they consider important.


Universities with an interest in technology and innovation could look at the Scimago rankings, which include patents. Those with strengths in medical research might find it beneficial to go for the THE rankings, but they should always watch out for changes in methodology.

Using local benchmarks is not a bad idea, and it can be valuable for those institutions that are not so concerned with research. But many Malaysian institutions are now competing on the global stage and are subject to international assessment, and that, whether they like it or not, means assessment by rankings. It would be an improvement if benchmarks and targets were expressed as reaching a certain level in two or three rankings, not just one. Institutions should also focus on specific indicators rather than the overall score, and different rankings and indicators should be used to assess and compare different places.


For example, the Round University Rankings from Russia, which include five of the six metrics in the QS rankings plus others, with more sensible weightings, could be used to supplement the QS world rankings.


For measuring research output and quality, the Leiden Ranking might be a better alternative to either the QS or the THE rankings. Universities with an innovation mission could refer to the innovative knowledge metric in the Scimago Institutions Rankings.

When we come to measuring teaching and the quality of graduates there is little of value from the current range of global rankings. There have been some interesting initiatives such as the OECD's AHELO project and U-Multirank but these have yet to be widely accepted. The only international metric that even attempts to directly assess graduate quality is QS's employer survey.

So, universities, governments and stakeholders need to stop thinking about using one ranking as a benchmark for everyone, and to stop looking only at the overall rankings.
