
Inadequacies exist in ranking systems


WORLD university rankings have become a very interesting topic in recent years.

In only a short period of time, the lists that rank the world’s higher education institutions have grown in number and become more influential.

The rankings influence both the institutional strategies of universities and national research policy.

The rankings have become popular in recent times because of increased globalisation.

The results are featured as news on university websites, reported on general news channels and in newspapers, and widely shared on social media.

Students use them, institutions rely on them for marketing, decision-making and benchmarking, and decision-makers and politicians consult them too.

Four of the most widely followed ranking systems are the Times Higher Education World University Rankings (THE Rankings), the Quacquarelli Symonds (QS) World University Rankings, the Academic Ranking of World Universities (ARWU) by the Shanghai Ranking Consultancy (formerly known as the Shanghai Jiao Tong University Academic Ranking of World Universities) and the Webometrics Ranking of World Universities by the Consejo Superior de Investigaciones Científicas (CSIC) in Spain.

A widely accepted ranking system helps students choose the right institution at which to pursue a higher degree and secure funding. It also helps new researchers decide where to carry out their research.

This in turn fosters competition among institutions.

Since 2004, Malaysia’s universities have consistently participated in the QS World University Rankings.

The recently published 2018 rankings saw five Malaysian research universities break into the top 300, with Universiti Malaya leading the way at 114th, its third consecutive year of improvement.

This is recognition of the research universities’ tireless efforts to produce research publications.

However, many inadequacies exist in each of the ranking systems.

Firstly, do they reflect the quality of research? The rankings reflect the overall international impact of a university’s research.

It is an important aspect of quality, but it is not the same thing as quality.

The rankings give some idea of differences in quality, but are they measuring the same comparable things?

Try comparing the research budget of Harvard with that of a university in Sweden, for example.

It’s like comparing a sports car with a small family car and concluding which is best.

Secondly, the methodology is not clearly defined and does not always meet scientific standards.

A closer look at the QS World University Rankings indicators and their weightings shows an allocation of 40% for academic reputation, 20% for faculty-student ratio, 20% for citations per faculty, 10% for employer reputation, and 5% each for the international student and international faculty ratios.
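To make the arithmetic concrete, the sketch below shows how such weightings combine indicator scores into a single composite. It is only an illustration: the weights are the ones quoted above, but the indicator scores for the imaginary institution are invented, and the actual QS methodology also normalises each indicator across all ranked universities before weighting, a step this sketch does not attempt.

```python
# A minimal, illustrative weighted-composite calculation.
# Weights follow the QS allocation quoted above; the indicator scores for
# the imaginary example university are assumptions made up for this sketch.

QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "employer_reputation": 0.10,
    "international_students": 0.05,
    "international_faculty": 0.05,
}

def composite_score(indicator_scores):
    """Combine per-indicator scores (each on a 0-100 scale) into one weighted total."""
    return sum(QS_WEIGHTS[name] * score for name, score in indicator_scores.items())

# Invented scores: strong on reputation, weaker on citations per faculty.
example_university = {
    "academic_reputation": 85.0,
    "faculty_student_ratio": 60.0,
    "citations_per_faculty": 40.0,
    "employer_reputation": 70.0,
    "international_students": 50.0,
    "international_faculty": 55.0,
}

print(round(composite_score(example_university), 2))  # roughly 66.25 with these assumed inputs
```

Because 40% of the total rides on academic reputation, a strong reputation score lifts the composite far more than any other indicator, which is precisely the over-reliance on peer review raised below.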

The QS World University Rankings places too much emphasis on peer review, which accounts for 40% of the overall score.

The rankings also disadvantage certain fields because the citation data is drawn from scientific journals, databases and other publications that are mainly in English and give less coverage to the humanities and social sciences.

Malaysian researchers in the social sciences and humanities publish more in their native language – Bahasa Melayu.

They produce high-quality work, but it is not cited as often as work in the medical and science fields and so is scarcely visible in the rankings.

Thirdly, global performance factors such as international collaboration among universities or scholars should be considered and evaluated more precisely in the ranking systems. Doing so would give a better understanding of a university’s worldwide position.

Phil Baty, editor of the THE Rankings, sees them as a tool to help people gain an understanding of an institution or a national education system.

He is very well aware that rankings can be abused and points out that they should not be the only tool used. “The results must be seen in context and used in parallel with the institution’s own strategic plans.

Rankings can provide information that is useful for decision-making but are not intended to drive decisions. The best way to use them is to break the results down and look at individual aspects that are meaningful for you,” said Baty.

Finally, rankings have a distorting effect as they mainly use research-related measurements and largely disregard education and learning.

The rankings measure only a small part of what universities do: international publications and reputation. What of the activities that do not help an institution rise in the rankings, such as partnerships with the community, or teaching and learning, and which universities consequently give less priority? Such work is certainly worth a high score.

As a candidate of the Higher Education Leadership Academy (AKEPT) Young Scholars Programme (YSP), I had the privilege of joining an educational trip to review Thailand’s public and private higher education institutions. What I found fascinating is that public universities in Thailand emphasise community service and social responsibility. Indeed, these are their core values!

This is an approach worth adopting in our Malaysian universities. In an education system preoccupied with grades, rankings and examinations, the idea of community service feels distant.

But think about it. How can we encourage our graduates to adopt this mindset? What happens if Malaysian graduates begin to adopt the mindset of contributing back to the community?

What will Malaysia’s socio-economic status be like with this renewed mindset?

Incorporating into Malaysian universities a module on community service, knowledge transfer and social responsibility, while preserving our Malaysian cultural diversity, is an interesting proposition worth pursuing. It does seem, however, that the university ranking phenomenon is here to stay.

I understand the importance of world university rankings, as they define what a university should be and how it performs.

But the fact remains that the underlying data is problematic and rankings have been allowed to influence research policy and the way universities formulate their work programmes.

Dr DONNIE ADAMS

Senior Lecturer

Institute of Educational Leadership

Universiti Malaya
