Rethinking the system of research assessment in higher education in the digital era

Date
2020
Journal Title
Journal ISSN
Volume Title
Publisher
Business Systems Laboratory
Abstract
Around the world, specific methods are employed to evaluate the quantity and quality of universities’ research output. Because of increased competition between institutions, research assessment plays a useful role in providing incentives to raise the quality of research. As the scientific literature suggests, research output is a function of the resources spent and the microeconomic incentive structure [1]. Countries that perform well and have introduced a system of research assessment possess methods to evaluate research output, and they have strengthened their systems by improving incentives. Evaluations matter as incentives and, at the same time, provide data on the research activity within a country. Without a transparent and objective way of examining research activity, it is not possible to see whether the research system is functioning and how it can be improved [1]. Nowadays, in institutions of higher education that employ scientists from many fields, assessing research performance requires evaluators to aggregate the performance measures of the various fields. Two methods of aggregation are applied, based on: (a) the performance of the individual scientists or (b) the performance of the scientific fields present in the institution [2]. The choice depends on the context and the objectives, and the two methods produce different performance scores as well as different rankings. Many countries have introduced evaluations of university research, reflecting global demands for greater accountability. Revisiting research assessment procedures is a shared responsibility and requires a concerted approach uniting major actors such as researchers and universities, research funding organizations, policymakers and non-governmental organizations.
In this process, a leading role can be played by the European University Association (EUA), since it is the representative organization of universities and national rectors’ conferences in 48 European countries. EUA plays a crucial role in the Bologna Process and in influencing European Union policies on higher education, research and innovation. The Expert Group and the EUA Secretariat developed the EUA Roadmap on Research Assessment in the Transition to Open Science and launched an Expert Subgroup on Research Assessment in 2018 [10]. International collaboration through various networks and non-governmental organizations such as EUA should make a considerable contribution to gathering and sharing information, initiating dialogue between key actors, formulating good practice and making policy recommendations for the next step in revising the system of research assessment.
Description
Aled ab Iorwerth (2005). Methods of Evaluating University Research Around the World. Department of Finance, Working Paper 2005-04, March 2005. https://www.fin.gc.ca/pub/pdfs/wp2005-04e.pdf
Giovanni Abramo & Ciriaco Andrea D’Angelo (2015). Evaluating university research: Same performance indicator, different rankings. Journal of Informetrics, 9(3), July 2015, pp. 514-525. https://doi.org/10.1016/j.joi.2015.04.002
Geuna, Aldo & Martin, Ben (2003). University Research Evaluation and Funding: An International Comparison. Minerva, 41, pp. 277-304. doi:10.1023/B:MINE.0000005155.70870.bd
Ghent University. Principles for the evaluation of research. https://www.ugent.be/en/research/research-ugent/research-strategy/research-evaluation.htm
European University Association (2019). EUA Briefing: Reflections on University Research Assessment. Key concepts, issues and actors. Bregt Saenen & Lidia Borrell-Damián, April 2019. https://eua.eu/downloads/publications/reflections%20on%20university%20research%20assessment%20key%20concepts%20issues%20and%20actors.pdf
European Commission Working Group on Rewards under Open Science (2017). Evaluation of Research Careers fully acknowledging Open Science Practices. Rewards, incentives and/or recognition for researchers practicing Open Science. Luxembourg: Publications Office of the European Union, p. 20. Retrieved 25 March 2019, from: https://publications.europa.eu/en/publication-detail/-/publication/47a3a330-c9cb-11e7-8e69-01aa75ed71a1
DORA (2012). San Francisco Declaration on Research Assessment. Retrieved 12 March 2019, from: https://sfdora.org/read/
Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). The Leiden Manifesto for research metrics. Nature, 520, pp. 429-431. Retrieved 12 March 2019, from: http://www.leidenmanifesto.org/
European Commission Expert Group on Altmetrics (2017). Next-generation metrics: Responsible metrics and evaluation for open science. Luxembourg: Publications Office of the European Union, pp. 8-14. Retrieved 7 December 2018, from: https://publications.europa.eu/en/publication-detail/-/publication/b858d952-0a19-11e7-8a35-01aa75ed71a1
European University Association (2018). EUA Roadmap on Research Assessment in the Transition to Open Science. Brussels: EUA. Retrieved 12 March 2019, from: https://eua.eu/resources/publications/316:eua-roadmap-on-researchassessment-in-thetransition-to-open-science.html
Keywords
Research Assessment, Higher Education, Conventional Metrics, Usage-Based Metrics, Alternative Metrics, Next Generation Metrics
Citation
7th Business Systems Laboratory International Symposium, “Socio-Economic Ecosystems: Challenges for Sustainable Development in the Digital Era”. University of Alicante, Polytechnic School, January 22-24, 2020, pp. 268-272.
Collections