
An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014–2017)

Serious concerns about research quality have catalysed a number of reform initiatives intended to improve transparency and reproducibility and thus facilitate self-correction, increase efficiency and enhance research credibility. Meta-research has evaluated the merits of some individual initiatives; however, this may not capture broader trends reflecting the cumulative contribution of these efforts. In this study, we manually examined a random sample of 250 articles in order to estimate the prevalence of a range of transparency and reproducibility-related indicators in the social sciences literature published between 2014 and 2017. Few articles indicated availability of materials (16/151, 11% [95% confidence interval, 7% to 16%]), protocols (0/156, 0% [0% to 1%]), raw data (11/156, 7% [2% to 13%]) or analysis scripts (2/156, 1% [0% to 3%]), and no studies were pre-registered (0/156, 0% [0% to 1%]). Some articles explicitly disclosed funding sources (or lack of; 74/236, 31% [25% to 37%]) and some declared no conflicts of interest (36/236, 15% [11% to 20%]). Replication studies were rare (2/156, 1% [0% to 3%]). Few studies were included in evidence synthesis via systematic review (17/151, 11% [7% to 16%]) or meta-analysis (2/151, 1% [0% to 3%]). Less than half the articles were publicly available (101/250, 40% [34% to 47%]). Minimal adoption of transparency and reproducibility-related research practices could be undermining the credibility and efficiency of social science research. The present study establishes a baseline that can be revisited in the future to assess progress.


Bibliographic Details
Main Authors: Hardwicke, Tom E., Wallach, Joshua D., Kidwell, Mallory C., Bendixen, Theiss, Crüwell, Sophia, Ioannidis, John P. A.
Format: Online Article Text
Language: English
Published: The Royal Society, 2020
Subjects: Psychology and Cognitive Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7062098/
https://www.ncbi.nlm.nih.gov/pubmed/32257301
http://dx.doi.org/10.1098/rsos.190806

Record Details
Collection: PubMed
Record ID: pubmed-7062098
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: R Soc Open Sci
Section: Psychology and Cognitive Neuroscience
Publication Date: 2020-02-19
License: © 2020 The Authors. Published by the Royal Society under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, provided the original author and source are credited.