Null and Void? Errors in Meta-analysis on Perceptual Disfluency and Recommendations to Improve Meta-analytical Reproducibility
In the 2018 meta-analysis in Educational Psychology Review entitled “Null effects of perceptual disfluency on learning outcomes in a text-based educational context” by Xie, Zhou, and Liu, we identify some errors and inconsistencies in both the methodological approach and the reported results regarding coding and effect sizes. While from a technical point of view the meta-analysis aligns with current meta-analytical guidelines (e.g., PRISMA) and conforms to general meta-analytical requirements (e.g., considering publication bias), it exemplifies certain insufficient practices in the creation and review of meta-analyses. We criticize the lack of transparency and neglect of open-science practices in the generation and reporting of results, which complicate evaluation of meta-analytical reproducibility, especially given the flexibility in subjective choices regarding the analytical approach and the flexibility in creating the database. Here we present a framework applicable to pre- and post-publication review for improving the Methods Reproducibility of meta-analyses. Based on considerations of the Transparency and Openness Promotion (TOP) guidelines (Nosek et al. Science 348:1422–1425, 2015), the Reproducibility Enhancement Principles (REP; Stodden et al. Science 354:1240–1241, 2016), and recommendations by Lakens et al. (BMC Psychology 4: Article 24, 2016), we outline Computational Reproducibility (Level 1), Computational Verification (Level 2), Analysis Reproducibility (Level 3), and Outcome Reproducibility (Level 4). Applying reproducibility checks to TRANSFER performance as the chosen outcome variable, we found Xie and colleagues’ results to be (rather) robust. Yet, regarding RECALL performance and the moderator analysis, the identified problems raise doubts about the credibility of the reported results.
Main Authors: | Weissgerber, Sophia C.; Brunmair, Matthias; Rummer, Ralf |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Springer US, 2021 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7854329/ https://www.ncbi.nlm.nih.gov/pubmed/33551625 http://dx.doi.org/10.1007/s10648-020-09579-1 |
_version_ | 1783646068004618240 |
author | Weissgerber, Sophia C. Brunmair, Matthias Rummer, Ralf |
author_facet | Weissgerber, Sophia C. Brunmair, Matthias Rummer, Ralf |
author_sort | Weissgerber, Sophia C. |
collection | PubMed |
description | In the 2018 meta-analysis in Educational Psychology Review entitled “Null effects of perceptual disfluency on learning outcomes in a text-based educational context” by Xie, Zhou, and Liu, we identify some errors and inconsistencies in both the methodological approach and the reported results regarding coding and effect sizes. While from a technical point of view the meta-analysis aligns with current meta-analytical guidelines (e.g., PRISMA) and conforms to general meta-analytical requirements (e.g., considering publication bias), it exemplifies certain insufficient practices in the creation and review of meta-analyses. We criticize the lack of transparency and neglect of open-science practices in the generation and reporting of results, which complicate evaluation of meta-analytical reproducibility, especially given the flexibility in subjective choices regarding the analytical approach and the flexibility in creating the database. Here we present a framework applicable to pre- and post-publication review for improving the Methods Reproducibility of meta-analyses. Based on considerations of the Transparency and Openness Promotion (TOP) guidelines (Nosek et al. Science 348:1422–1425, 2015), the Reproducibility Enhancement Principles (REP; Stodden et al. Science 354:1240–1241, 2016), and recommendations by Lakens et al. (BMC Psychology 4: Article 24, 2016), we outline Computational Reproducibility (Level 1), Computational Verification (Level 2), Analysis Reproducibility (Level 3), and Outcome Reproducibility (Level 4). Applying reproducibility checks to TRANSFER performance as the chosen outcome variable, we found Xie and colleagues’ results to be (rather) robust. Yet, regarding RECALL performance and the moderator analysis, the identified problems raise doubts about the credibility of the reported results. |
format | Online Article Text |
id | pubmed-7854329 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Springer US |
record_format | MEDLINE/PubMed |
spelling | pubmed-7854329 2021-02-03 Null and Void? Errors in Meta-analysis on Perceptual Disfluency and Recommendations to Improve Meta-analytical Reproducibility Weissgerber, Sophia C. Brunmair, Matthias Rummer, Ralf Educ Psychol Rev Commentary In the 2018 meta-analysis in Educational Psychology Review entitled “Null effects of perceptual disfluency on learning outcomes in a text-based educational context” by Xie, Zhou, and Liu, we identify some errors and inconsistencies in both the methodological approach and the reported results regarding coding and effect sizes. While from a technical point of view the meta-analysis aligns with current meta-analytical guidelines (e.g., PRISMA) and conforms to general meta-analytical requirements (e.g., considering publication bias), it exemplifies certain insufficient practices in the creation and review of meta-analyses. We criticize the lack of transparency and neglect of open-science practices in the generation and reporting of results, which complicate evaluation of meta-analytical reproducibility, especially given the flexibility in subjective choices regarding the analytical approach and the flexibility in creating the database. Here we present a framework applicable to pre- and post-publication review for improving the Methods Reproducibility of meta-analyses. Based on considerations of the Transparency and Openness Promotion (TOP) guidelines (Nosek et al. Science 348:1422–1425, 2015), the Reproducibility Enhancement Principles (REP; Stodden et al. Science 354:1240–1241, 2016), and recommendations by Lakens et al. (BMC Psychology 4: Article 24, 2016), we outline Computational Reproducibility (Level 1), Computational Verification (Level 2), Analysis Reproducibility (Level 3), and Outcome Reproducibility (Level 4). Applying reproducibility checks to TRANSFER performance as the chosen outcome variable, we found Xie and colleagues’ results to be (rather) robust. Yet, regarding RECALL performance and the moderator analysis, the identified problems raise doubts about the credibility of the reported results. Springer US 2021-02-03 2021 /pmc/articles/PMC7854329/ /pubmed/33551625 http://dx.doi.org/10.1007/s10648-020-09579-1 Text en © The Author(s) 2021 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) . |
spellingShingle | Commentary Weissgerber, Sophia C. Brunmair, Matthias Rummer, Ralf Null and Void? Errors in Meta-analysis on Perceptual Disfluency and Recommendations to Improve Meta-analytical Reproducibility |
title | Null and Void? Errors in Meta-analysis on Perceptual Disfluency and Recommendations to Improve Meta-analytical Reproducibility |
title_full | Null and Void? Errors in Meta-analysis on Perceptual Disfluency and Recommendations to Improve Meta-analytical Reproducibility |
title_fullStr | Null and Void? Errors in Meta-analysis on Perceptual Disfluency and Recommendations to Improve Meta-analytical Reproducibility |
title_full_unstemmed | Null and Void? Errors in Meta-analysis on Perceptual Disfluency and Recommendations to Improve Meta-analytical Reproducibility |
title_short | Null and Void? Errors in Meta-analysis on Perceptual Disfluency and Recommendations to Improve Meta-analytical Reproducibility |
title_sort | null and void? errors in meta-analysis on perceptual disfluency and recommendations to improve meta-analytical reproducibility |
topic | Commentary |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7854329/ https://www.ncbi.nlm.nih.gov/pubmed/33551625 http://dx.doi.org/10.1007/s10648-020-09579-1 |
work_keys_str_mv | AT weissgerbersophiac nullandvoiderrorsinmetaanalysisonperceptualdisfluencyandrecommendationstoimprovemetaanalyticalreproducibility AT brunmairmatthias nullandvoiderrorsinmetaanalysisonperceptualdisfluencyandrecommendationstoimprovemetaanalyticalreproducibility AT rummerralf nullandvoiderrorsinmetaanalysisonperceptualdisfluencyandrecommendationstoimprovemetaanalyticalreproducibility |