
Auto-correlation of journal impact factor for consensus research reporting statements: a cohort study

Background. The Journal Citation Reports journal impact factors (JIFs) are widely used to rank and evaluate journals, standing as a proxy for the relative importance of a journal within its field. However, numerous criticisms have been made of the use of the JIF to evaluate importance. This problem is exacerbated when the use of JIFs is extended to evaluate not only the journals, but also the papers therein. The purpose of this study was therefore to investigate the relationship between the number of citations and the journal JIF for identical articles published simultaneously in multiple journals.

Methods. Eligible articles were consensus research reporting statements listed on the EQUATOR Network website that were published simultaneously in three or more journals. For each reporting statement, the correlation between the citation count for each article and the median journal JIF over the publication period, and between the citation count and the number of article accesses, was calculated.

Results. Nine research reporting statements were included in this analysis, representing 85 articles published across 58 journals in biomedicine. The number of citations was strongly correlated with the JIF for six of the nine reporting guidelines, with moderate correlation shown for the remaining three guidelines (median r = 0.66, 95% CI [0.45–0.90]). There was also a strong positive correlation between the number of citations and the number of article accesses (median r = 0.71, 95% CI [0.5–0.8]), although the number of data points for this analysis was limited. When adjusted for the individual reporting guidelines, each logarithm unit of JIF predicted a median increase of 0.8 logarithm units of citation counts (95% CI [−0.4–5.2]), and each logarithm unit of article accesses predicted a median increase of 0.1 logarithm units of citation counts (95% CI [−0.9–1.4]). This model explained 26% of the variance in citations (median adjusted r² = 0.26, range 0.18–1.0).

Conclusion. The impact factor of the journal in which a reporting statement was published was shown to influence the number of citations that statement will gather over time. Similarly, the number of article accesses also influenced the number of citations, although to a lesser extent than the impact factor. This demonstrates that citation counts are not purely a reflection of scientific merit and that the impact factor is, in fact, auto-correlated.
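
As a rough illustration of the kind of analysis described in the abstract (not the author's code), the sketch below computes a per-guideline Pearson correlation between citation counts and median journal JIF, and fits a pooled log-log regression of citations on JIF with a separate intercept for each reporting guideline. All data, guideline names and variable names are hypothetical placeholders.

# Minimal sketch, assuming each record is (reporting guideline, median JIF, citation count).
# The numbers below are invented for illustration only.
import numpy as np

records = [
    ("CONSORT", 2.9, 410), ("CONSORT", 17.2, 3100), ("CONSORT", 4.4, 620),
    ("PRISMA",  3.1, 350), ("PRISMA",  16.4, 2800), ("PRISMA",  5.0, 540),
]

guidelines = sorted({g for g, _, _ in records})

# Per-guideline correlation between citation counts and median journal JIF.
for g in guidelines:
    jif = np.array([j for gg, j, _ in records if gg == g], dtype=float)
    cites = np.array([c for gg, _, c in records if gg == g], dtype=float)
    r = np.corrcoef(jif, cites)[0, 1]
    print(f"{g}: Pearson r = {r:.2f} (n = {len(jif)})")

# Pooled model on the log scale: log(citations) ~ log(JIF) + guideline indicators,
# i.e. a separate intercept per guideline, mirroring "adjusted for the individual
# reporting guidelines" in the abstract.
log_jif = np.log(np.array([j for _, j, _ in records], dtype=float))
log_cites = np.log(np.array([c for _, _, c in records], dtype=float))
dummies = np.column_stack([
    [1.0 if gg == g else 0.0 for gg, _, _ in records] for g in guidelines
])
X = np.column_stack([log_jif, dummies])
beta, *_ = np.linalg.lstsq(X, log_cites, rcond=None)
print(f"slope: each log unit of JIF ~ {beta[0]:.2f} log units of citations")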


Bibliographic Details
Main Author: Shanahan, Daniel R.
Format: Online, Article, Text
Language: English
Published: PeerJ Inc., 2016-03-31
Subjects: Science and Medical Education
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4824875/
https://www.ncbi.nlm.nih.gov/pubmed/27069817
http://dx.doi.org/10.7717/peerj.1887
License: ©2016 Shanahan. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose, provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ) and either the DOI or the URL of the article must be cited.