
Establishing Institutional Scores With the Rigor and Transparency Index: Large-scale Analysis of Scientific Reporting Quality


Bibliographic Details
Main Authors: Menke, Joe, Eckmann, Peter, Ozyurt, Ibrahim Burak, Roelandse, Martijn, Anderson, Nathan, Grethe, Jeffrey, Gamst, Anthony, Bandrowski, Anita
Format: Online Article Text
Language: English
Published: JMIR Publications 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9274430/
https://www.ncbi.nlm.nih.gov/pubmed/35759334
http://dx.doi.org/10.2196/37324
collection PubMed
description
BACKGROUND: Improving rigor and transparency measures should lead to improvements in reproducibility across the scientific literature; however, the assessment of measures of transparency tends to be very difficult if performed manually.
OBJECTIVE: This study addresses the enhancement of the Rigor and Transparency Index (RTI, version 2.0), which attempts to automatically assess the rigor and transparency of journals, institutions, and countries using manuscripts scored on criteria found in reproducibility guidelines (eg, Materials Design, Analysis, and Reporting checklist criteria).
METHODS: The RTI tracks 27 entity types using natural language processing techniques such as Bidirectional Long Short-term Memory Conditional Random Field–based models and regular expressions; this allowed us to assess over 2 million papers accessed through PubMed Central.
RESULTS: Between 1997 and 2020 (where data were readily available in our data set), rigor and transparency measures showed general improvement (RTI 2.29 to 4.13), suggesting that authors are taking the need for improved reporting seriously. The top-scoring journals in 2020 were the Journal of Neurochemistry (6.23), British Journal of Pharmacology (6.07), and Nature Neuroscience (5.93). We extracted the institution and country of origin from the author affiliations to expand our analysis beyond journals. Among institutions publishing >1000 papers in 2020 (in the PubMed Central open access set), Capital Medical University (4.75), Yonsei University (4.58), and University of Copenhagen (4.53) were the top performers in terms of RTI. In country-level performance, we found that Ethiopia and Norway consistently topped the RTI charts of countries with 100 or more papers per year. In addition, we tested our assumption that the RTI may serve as a reliable proxy for scientific replicability (ie, a high RTI represents papers containing sufficient information for replication efforts). Using work by the Reproducibility Project: Cancer Biology, we determined that replication papers (RTI 7.61, SD 0.78) scored significantly higher (P<.001) than the original papers (RTI 3.39, SD 1.12), which according to the project required additional information from authors to begin replication efforts.
CONCLUSIONS: These results align with our view that RTI may serve as a reliable proxy for scientific replicability. Unfortunately, RTI measures for journals, institutions, and countries fall short of the replicated paper average. If we consider the RTI of these replication studies as a target for future manuscripts, more work will be needed to ensure that the average manuscript contains sufficient information for replication attempts.
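The regular-expression side of the kind of pipeline the METHODS section describes can be illustrated with a minimal sketch. The entity names and patterns below are illustrative assumptions for a few commonly checked rigor criteria (RRIDs, blinding, randomization, power analysis); they are not the RTI's actual 27 entity types or its published patterns, and the BiLSTM-CRF component is not reproduced here.

```python
import re

# Illustrative patterns only; the RTI's real criteria set is larger
# and also relies on trained BiLSTM-CRF entity recognizers.
PATTERNS = {
    "rrid": re.compile(r"RRID:\s*[A-Z]+[_:][A-Za-z0-9_:-]+"),
    "blinding": re.compile(r"\bblind(?:ed|ing)?\b", re.IGNORECASE),
    "randomization": re.compile(r"\brandomi[sz](?:ed|ation)\b", re.IGNORECASE),
    "power_analysis": re.compile(r"\bpower (?:analysis|calculation)\b", re.IGNORECASE),
}

def detect_criteria(text: str) -> dict:
    """Flag which rigor-related criteria are mentioned in a manuscript's text."""
    return {name: bool(pattern.search(text)) for name, pattern in PATTERNS.items()}

sample = ("Mice were randomized to treatment groups and observers were blinded. "
          "Primary antibody: anti-GFP (RRID:AB_221569).")
flags = detect_criteria(sample)
# flags -> rrid, blinding, and randomization True; power_analysis False
```

A per-paper score in the spirit of the RTI could then be derived by counting how many applicable criteria are detected, though the actual weighting and applicability logic used by the index is not shown here.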
id pubmed-9274430
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-9274430 2022-07-13 Establishing Institutional Scores With the Rigor and Transparency Index: Large-scale Analysis of Scientific Reporting Quality J Med Internet Res Original Paper JMIR Publications 2022-06-27 /pmc/articles/PMC9274430/ /pubmed/35759334 http://dx.doi.org/10.2196/37324 Text en ©Joe Menke, Peter Eckmann, Ibrahim Burak Ozyurt, Martijn Roelandse, Nathan Anderson, Jeffrey Grethe, Anthony Gamst, Anita Bandrowski. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 27.06.2022. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.
topic Original Paper