
Confounds in “Failed” Replications

Reproducibility is essential to science, yet a distressingly large number of research findings do not seem to replicate. Here I discuss one underappreciated reason for this state of affairs. I make my case by noting that, due to artifacts, several of the replication failures of the vastly advertised Open Science Collaboration’s Reproducibility Project: Psychology turned out to be invalid. Although these artifacts would have been obvious on perusal of the data, such perusal was deemed undesirable because of its post hoc nature and was left out. However, while data do not lie, unforeseen confounds can render them unable to speak to the question of interest. I look further into one unusual case in which a major artifact could be removed statistically—the nonreplication of the effect of fertility on partnered women’s preference for single over attached men. I show that the “failed replication” datasets contain a gross bias in stimulus allocation which is absent in the original dataset; controlling for it replicates the original study’s main finding. I conclude that, before being used to make a scientific point, all data should undergo a minimal quality control—a provision, it appears, not always required of those collected for purpose of replication. Because unexpected confounds and biases can be laid bare only after the fact, we must get over our understandable reluctance to engage in anything post hoc. The reproach attached to p-hacking cannot exempt us from the obligation to (openly) take a good look at our data.

Bibliographic Details
Main Author: Bressan, Paola
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2019
Subjects: Psychology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6737580/
https://www.ncbi.nlm.nih.gov/pubmed/31551846
http://dx.doi.org/10.3389/fpsyg.2019.01884
Journal: Frontiers in Psychology (Front Psychol), published online 2019-09-04
License: Copyright © 2019 Bressan. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). Use, distribution, or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and the original publication in this journal is cited, in accordance with accepted academic practice.