Anchoring effects in the assessment of papers: An empirical survey of citing authors
In our study, we empirically examined the assessment of cited papers within the framework of the anchoring-and-adjustment heuristic. We are interested in whether the assessment of a paper can be influenced by numerical information that acts as an anchor (e.g. citation impact). …
Main Authors: Bornmann, Lutz; Ganser, Christian; Tekles, Alexander
Format: Online Article Text
Language: English
Published: Public Library of Science, 2023
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10065272/ | https://www.ncbi.nlm.nih.gov/pubmed/37000889 | http://dx.doi.org/10.1371/journal.pone.0283893
_version_ | 1785018070051323904 |
author | Bornmann, Lutz Ganser, Christian Tekles, Alexander |
author_facet | Bornmann, Lutz Ganser, Christian Tekles, Alexander |
author_sort | Bornmann, Lutz |
collection | PubMed |
description | In our study, we empirically examined the assessment of cited papers within the framework of the anchoring-and-adjustment heuristic. We are interested in whether the assessment of a paper can be influenced by numerical information that acts as an anchor (e.g. citation impact). We undertook a survey of corresponding authors with an available email address in the Web of Science database. The authors were asked to assess the quality of papers that they had cited in previous papers. Some authors were assigned to one of three treatment groups that received further information alongside the cited paper: citation impact information, information on the publishing journal (journal impact factor), or a numerical access code to enter the survey. The control group did not receive any further numerical information. We are interested in whether possible adjustments in the assessments can be produced not only by quality-related information (citation impact or journal impact), but also by numbers that are unrelated to quality, i.e. the access code. Our results show that the quality assessments of papers seem to depend on the citation impact information of single papers. The other anchors, such as an arbitrary number (the access code) and journal impact information, did not play an important role in the assessments of papers. The results point to a possible anchoring bias caused by insufficient adjustment: the respondents seem to have assessed cited papers differently when they observed paper impact values in the survey. We conclude that initiatives aimed at reducing the use of journal impact information in research evaluation have either already been successful or have overestimated the influence of this information. |
format | Online Article Text |
id | pubmed-10065272 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-100652722023-04-01 Anchoring effects in the assessment of papers: An empirical survey of citing authors Bornmann, Lutz Ganser, Christian Tekles, Alexander PLoS One Research Article Public Library of Science 2023-03-31 /pmc/articles/PMC10065272/ /pubmed/37000889 http://dx.doi.org/10.1371/journal.pone.0283893 Text en © 2023 Bornmann et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Bornmann, Lutz Ganser, Christian Tekles, Alexander Anchoring effects in the assessment of papers: An empirical survey of citing authors |
title | Anchoring effects in the assessment of papers: An empirical survey of citing authors |
title_full | Anchoring effects in the assessment of papers: An empirical survey of citing authors |
title_fullStr | Anchoring effects in the assessment of papers: An empirical survey of citing authors |
title_full_unstemmed | Anchoring effects in the assessment of papers: An empirical survey of citing authors |
title_short | Anchoring effects in the assessment of papers: An empirical survey of citing authors |
title_sort | anchoring effects in the assessment of papers: an empirical survey of citing authors |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10065272/ https://www.ncbi.nlm.nih.gov/pubmed/37000889 http://dx.doi.org/10.1371/journal.pone.0283893 |
work_keys_str_mv | AT bornmannlutz anchoringeffectsintheassessmentofpapersanempiricalsurveyofcitingauthors AT ganserchristian anchoringeffectsintheassessmentofpapersanempiricalsurveyofcitingauthors AT teklesalexander anchoringeffectsintheassessmentofpapersanempiricalsurveyofcitingauthors |