Tracking Replicability as a Method of Post-Publication Open Evaluation

Recent reports have suggested that many published results are unreliable. To increase the reliability and accuracy of published papers, multiple changes have been proposed, such as changes in statistical methods. We support such reforms. However, we believe that the incentive structure of scientific publishing must change for such reforms to be successful. Under the current system, the quality of individual scientists is judged on the basis of their number of publications and citations, with journals similarly judged via numbers of citations. Neither of these measures takes into account the replicability of the published findings, as false or controversial results are often particularly widely cited. We propose tracking replications as a means of post-publication evaluation, both to help researchers identify reliable findings and to incentivize the publication of reliable results. Tracking replications requires a database linking published studies that replicate one another. As any such database is limited by the number of replication attempts published, we propose establishing an open-access journal dedicated to publishing replication attempts. Data quality of both the database and the affiliated journal would be ensured through a combination of crowd-sourcing and peer review. As reports in the database are aggregated, ultimately it will be possible to calculate replicability scores, which may be used alongside citation counts to evaluate the quality of work published in individual journals. In this paper, we lay out a detailed description of how this system could be implemented, including mechanisms for compiling the information, ensuring data quality, and incentivizing the research community to participate.
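The abstract describes the mechanism (a database linking studies to replication attempts, aggregated into replicability scores) but does not specify how a score would be computed. Purely as an illustrative sketch, not the authors' specification: a minimal database could map each study to its recorded replication attempts and score a study as the fraction of attempts that succeeded. The identifiers and the success-fraction scoring rule below are assumptions for illustration.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class ReplicationReport:
    """One published attempt to replicate an earlier study."""
    original_id: str     # identifier of the study being replicated (e.g., a DOI)
    replication_id: str  # identifier of the replication report itself
    successful: bool     # whether the attempt reproduced the original finding


class ReplicationDatabase:
    """Links published studies to the replication attempts that target them."""

    def __init__(self) -> None:
        self._reports: dict[str, list[ReplicationReport]] = defaultdict(list)

    def add_report(self, report: ReplicationReport) -> None:
        self._reports[report.original_id].append(report)

    def replicability_score(self, original_id: str):
        """Fraction of recorded attempts that succeeded, or None if none exist.

        NOTE: this success-fraction rule is an assumption for illustration;
        the paper does not commit to a particular scoring formula here.
        """
        reports = self._reports.get(original_id, [])
        if not reports:
            return None
        return sum(r.successful for r in reports) / len(reports)


db = ReplicationDatabase()
db.add_report(ReplicationReport("10.1000/original", "10.1000/replication-1", True))
db.add_report(ReplicationReport("10.1000/original", "10.1000/replication-2", False))
print(db.replicability_score("10.1000/original"))  # 0.5
```

In practice such scores would only become meaningful as reports aggregate, which is why the authors pair the database with an open-access journal for replication attempts and with crowd-sourced and peer-reviewed quality control.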

Bibliographic Details
Main Authors: Hartshorne, Joshua K., Schachner, Adena
Format: Online Article Text
Language: English
Published: Frontiers Research Foundation 2012
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3293145/
https://www.ncbi.nlm.nih.gov/pubmed/22403538
http://dx.doi.org/10.3389/fncom.2012.00008
Record ID: pubmed-3293145
Collection: PubMed (National Center for Biotechnology Information; MEDLINE/PubMed record format)
Journal: Front Comput Neurosci (Neuroscience section)
Published online: 2012-03-05
License: Copyright © 2012 Hartshorne and Schachner. This is an open-access article distributed under the terms of the Creative Commons Attribution Non Commercial License (http://www.frontiersin.org/licenseagreement), which permits non-commercial use, distribution, and reproduction in other forums, provided the original authors and source are credited.