Meta-analyses in psychology often overestimate evidence for and size of effects

Bibliographic Details
Main Authors: Bartoš, František, Maier, Maximilian, Shanks, David R., Stanley, T. D., Sladekova, Martina, Wagenmakers, Eric-Jan
Format: Online Article Text
Language: English
Published: The Royal Society, 2023
Subjects: Psychology and Cognitive Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10320355/
https://www.ncbi.nlm.nih.gov/pubmed/37416830
http://dx.doi.org/10.1098/rsos.230224
collection PubMed
description Adjusting for publication bias is essential when drawing meta-analytic inferences. However, most methods that adjust for publication bias do not perform well across a range of research conditions, such as the degree of heterogeneity in effect sizes across studies. Sladekova et al. 2022 (Estimating the change in meta-analytic effect size estimates after the application of publication bias adjustment methods. Psychol. Methods) tried to circumvent this complication by selecting the methods most appropriate for a given set of conditions, and concluded that publication bias on average causes only minimal overestimation of effect sizes in psychology. However, this approach suffers from a ‘Catch-22’ problem: to know the underlying research conditions, one needs to have adjusted for publication bias correctly, but to correctly adjust for publication bias, one needs to know the underlying research conditions. To alleviate this problem, we conduct an alternative analysis, robust Bayesian meta-analysis (RoBMA), which is based not on model selection but on model averaging. In RoBMA, models that predict the observed results better are given correspondingly larger weights. A RoBMA reanalysis of Sladekova et al.’s dataset reveals that more than 60% of meta-analyses in psychology notably overestimate the evidence for the presence of the meta-analytic effect and more than 50% overestimate its magnitude.
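The description's central idea — that model averaging weights candidate models by how well they predict the observed data — can be illustrated without the actual RoBMA R package. The sketch below is a deliberately minimal two-model version: a fixed-effect meta-analysis averaged over an effect model and a null model. The real RoBMA additionally averages over heterogeneity and publication-selection models, so this is only an illustration of the weighting principle; the function name `model_averaged_effect` and the default prior variance `prior_var=1.0` are illustrative assumptions, not the paper's specification.

```python
import math


def normal_logpdf(x, mean, var):
    """Log density of a Normal(mean, var) distribution at x."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)


def model_averaged_effect(effects, ses, prior_var=1.0):
    """Two-model Bayesian model averaging for a fixed-effect meta-analysis.

    M0: true effect mu = 0.   M1: mu ~ Normal(0, prior_var).
    effects / ses are per-study effect sizes and standard errors.
    Returns (posterior probability of M1, model-averaged estimate of mu).
    """
    weights = [1.0 / se ** 2 for se in ses]          # inverse-variance weights
    W = sum(weights)
    ybar = sum(w * y for w, y in zip(weights, effects)) / W

    # The two marginal likelihoods differ only in the predictive density of
    # the pooled estimate ybar: variance 1/W under M0, prior_var + 1/W under M1.
    log_bf10 = (normal_logpdf(ybar, 0.0, prior_var + 1.0 / W)
                - normal_logpdf(ybar, 0.0, 1.0 / W))
    bf10 = math.exp(log_bf10)
    p1 = bf10 / (1.0 + bf10)          # posterior prob of M1, equal prior odds

    # Posterior mean of mu under M1 (conjugate normal update); it is 0 under M0,
    # so the model-averaged estimate is shrunk toward 0 by the null's weight.
    post_mean_m1 = (W * ybar) / (W + 1.0 / prior_var)
    return p1, p1 * post_mean_m1
```

With consistent, precisely estimated effects the effect model predicts the data far better, so it receives almost all the weight and the averaged estimate is close to the ordinary pooled mean; with noisy near-zero effects the null model dominates and the averaged estimate shrinks toward zero — the "better-predicting models get larger weights" behaviour the description attributes to RoBMA.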
id pubmed-10320355
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-10320355 2023-07-06. R Soc Open Sci, Psychology and Cognitive Neuroscience. The Royal Society 2023-07-05. /pmc/articles/PMC10320355/ /pubmed/37416830 http://dx.doi.org/10.1098/rsos.230224 Text en
© 2023 The Authors. Published by the Royal Society under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, provided the original author and source are credited.
title Meta-analyses in psychology often overestimate evidence for and size of effects
topic Psychology and Cognitive Neuroscience