Professional standards in bibliometric research evaluation? A meta-evaluation of European assessment practice 2005–2019

Despite growing demand for practicable methods of research evaluation, the use of bibliometric indicators remains controversial. This paper examines performance assessment practice in Europe—first, identifying the most commonly used bibliometric methods and, second, identifying the actors who have defined widespread practices. The framework of this investigation is Abbott’s theory of professions, and I argue that indicator-based research assessment constitutes a potential jurisdiction for both individual experts and expert organizations. This investigation was conducted using a search methodology that yielded 138 evaluation studies from 21 EU countries, covering the period 2005 to 2019. Structured content analysis revealed the following findings: (1) Bibliometric research assessment is most frequently performed in the Nordic countries, the Netherlands, Italy, and the United Kingdom. (2) The Web of Science (WoS) is the dominant database used for public research assessment in Europe. (3) Expert organizations invest in the improvement of WoS citation data and set technical standards with regard to data quality. (4) Citation impact is most frequently assessed with reference to international scientific fields. (5) The WoS classification of science fields retained its function as a de facto reference standard for research performance assessment. A detailed comparison of assessment practices between five dedicated organizations and other individual bibliometric experts suggests that corporate ownership and limited access to the most widely used citation databases have had a restraining effect on the development and diffusion of professional bibliometric methods during this period.

Bibliographic Details
Main Author: Jappe, Arlette
Format: Online Article Text
Language: English
Published: Public Library of Science, 2020-04-20
Journal: PLoS One
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7170233/
https://www.ncbi.nlm.nih.gov/pubmed/32310984
http://dx.doi.org/10.1371/journal.pone.0231735
Collection: PubMed (National Center for Biotechnology Information), PMCID: PMC7170233
Rights: © 2020 Arlette Jappe. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.