Open Science

Rankings

The benefits and influence of rankings on the world of academia have long been the subject of controversial debate. Rankings generally focus on measurable output, which can have unintended consequences, for example leading universities to concentrate on increasing the number of publications rather than improving their quality. Although rankings purport to comprehensively measure universities’ diverse achievements in teaching and research, they cannot do so, as they reduce indicators to a single score and focus on quantitative criteria.

As a result of this conflict between its values and the ranking’s approach, UZH has decided it will no longer provide data for the THE ranking.

The University of Zurich has actively campaigned for many years, both nationally and internationally, for a culture of openness in academia. Open Science stands for open exchange, transparency and reproducibility, and promotes high-quality, efficient and impactful research. Moreover, UZH is a signatory of the international Agreement on Reforming Research Assessment coordinated by the European Science Foundation, the European University Association and other organizations, in which the emphasis is on quality over quantity. The UZH leadership team is convinced that scientific quality should be the decisive factor in all research policy decisions.

Will UZH’s withdrawal from the THE Ranking affect its appeal for international students and researchers?

We recommend that all prospective students compare the content and structure of study programs, and that interested researchers and potential partner institutions inform themselves about the research programs, academic culture and working conditions at UZH. Finding the right course and university, and successfully beginning your studies or research career at an institution, always involves looking closely at the teaching and opportunities on offer before you take the plunge. This matters far more than any ranking. We therefore do not anticipate any negative impact on our international relationships.

Is the withdrawal from the THE ranking connected with the research assessment reform at UZH?

In 2022, UZH – together with more than 350 institutions from over 40 countries – signed the international Agreement on Reforming Research Assessment coordinated by the European Science Foundation, the European University Association and other organizations. The international agreement was established to initiate a global process to reform the way research achievements are assessed, with the aim of improving the quality and impact of research. The agreement lists 10 commitments to guide signatory institutions when making reforms internally.

By withdrawing from the THE ranking, UZH is taking an important first step toward implementing the agreement’s recommendation to avoid rankings in research assessment. This commitment is intended to prevent the metrics used in international rankings of research organizations, which are inappropriate for assessing individual researchers, from trickling down into research and researcher assessment.

Why is UZH not withdrawing from other rankings at the same time?

The requirements, conditions and methodology are not the same for all rankings. For some, universities have to actively provide data, while for others they are evaluated without having to participate. We are taking the time to assess the other rankings individually.

Why is UZH withdrawing from the Times Higher Education (THE) ranking?

The University of Zurich (UZH) has actively campaigned for many years, both nationally and internationally, for a culture of openness in academia. We are strongly committed to open science, which stands for open exchange, transparency and reproducibility, and promotes high-quality, efficient and impactful research. The benefits and influence of rankings on the world of academia have long been the subject of controversial debate. Some of the methods used to calculate rankings lack transparency and are controversial. Rankings focus on measurable output, which has unintended consequences, for example leading universities to concentrate on increasing the number of publications instead of improving the quality of their content.

Although rankings purport to comprehensively measure universities’ diverse achievements in teaching and research, they cannot realistically do so, as they reduce diverse indicators to a single score and focus on quantitative criteria. The rankings’ measurement of performance and quality is inadequate, they sometimes take the wrong aspects into account, and they run counter to universities’ strategic goals such as the promotion of open science practices. Within the ongoing movement to reform research assessment, rankings are viewed critically.

Because UZH’s viewpoint diverges from Times Higher Education’s approach, we have decided to no longer provide data for the THE ranking. We are also considering whether to continue participating in other rankings, such as the Shanghai, QS and US News & World Report rankings. With this move, UZH is taking another step toward a more transparent, open, efficient, fair and inclusive research environment that empowers our researchers to conduct cutting-edge research. We are convinced that scientific quality should be the decisive factor in all research policy decisions, that open science practices make an important contribution to that quality, and that rankings should not be allowed to have a negative influence in this regard.

What are the problems with rankings?

Rankings that measure the quality and performance of universities have long been controversial. Each ranking system defines a set of indicators which attempt to show various aspects of research and teaching, but ultimately the various facets are reduced to a score forming the basis for an institution’s ranking position. Such a score can never do justice to the multidimensionality of all of a university’s activities in teaching and research.

For the THE ranking, universities are obliged to provide data for the ranking calculation, which involves a great deal of work for staff. The data collection and evaluation methodology on which the ranking is based is not fully transparent or comprehensible. The overall score is calculated mainly on the basis of quantitative indicators that focus on research performance. Quantifiable results such as the number of publications are given disproportionate weighting compared with other key teaching and research activities undertaken by universities for the good of society. Rankings therefore give a measure of performance that is neither holistic nor verifiable: at best the picture is incomplete, at worst it is strongly distorted.