Effects of response format on achievement and aptitude assessment results: multi-level random effects meta-analyses

Psychological achievement and aptitude tests are fundamental elements of the everyday school, academic and professional lives of students, instructors, job applicants, researchers and policymakers. In line with growing demands for fair psychological assessment tools, we aimed to identify psychometric features of tests, test situations and test-taker characteristics that may contribute to the emergence of test bias. Multi-level random effects meta-analyses were conducted to estimate mean effect sizes for differences and relations between scores from achievement or aptitude measures with open-ended (OE) versus closed-ended (CE) response formats. Results from 102 primary studies with 392 effect sizes revealed positive relations between CE and OE assessments (mean r = 0.67, 95% CI [0.57; 0.76]), with negative pooled effect sizes for the difference between the two response formats (mean d_av = −0.65; 95% CI [−0.78; −0.53]). Significantly higher scores were obtained on CE exams. Stem-equivalency of items, low-stakes test situations, written short answer OE question types, studies conducted outside the United States and before the year 2000, and test-takers' achievement motivation and sex were at least partially associated with smaller differences and/or larger relations between scores from OE and CE formats. Limitations and the results' implications for practitioners in achievement and aptitude testing are discussed.
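The pooled estimates above come from multi-level random-effects models; the record does not name the software used. As a simplified illustration of the pooling idea only, the Python sketch below computes Cumming's d_av (mean difference divided by the average of the two standard deviations) and pools effect sizes with a two-level DerSimonian–Laird estimator. This is not the authors' analysis: the per-study effects and sampling variances are invented for demonstration, and the actual meta-analyses additionally model effect sizes nested within studies.

import numpy as np

def d_av(m_oe, m_ce, sd_oe, sd_ce):
    # Cumming's d_av: mean difference scaled by the average of the two SDs.
    return (m_oe - m_ce) / ((sd_oe + sd_ce) / 2.0)

def dersimonian_laird(y, v):
    # Random-effects pooling of effect sizes y with sampling variances v.
    w = 1.0 / v                                   # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)              # fixed-effect mean
    q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, mu - 1.96 * se, mu + 1.96 * se     # estimate and 95% CI

# Hypothetical per-study inputs (d_av, sampling variance); not from the paper.
y = np.array([-0.72, -0.55, -0.81, -0.40])
v = np.array([0.020, 0.015, 0.030, 0.025])
print(dersimonian_laird(y, v))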

Bibliographic Details
Main Authors: Breuer, Sonja; Scherndl, Thomas; Ortner, Tuulia M.
Format: Online Article Text
Language: English
Published: The Royal Society 2023
Subjects: Psychology and Cognitive Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10154931/
https://www.ncbi.nlm.nih.gov/pubmed/37153364
http://dx.doi.org/10.1098/rsos.220456
_version_ 1785036229282103296
author Breuer, Sonja
Scherndl, Thomas
Ortner, Tuulia M.
author_facet Breuer, Sonja
Scherndl, Thomas
Ortner, Tuulia M.
author_sort Breuer, Sonja
collection PubMed
description Psychological achievement and aptitude tests are fundamental elements of the everyday school, academic and professional lives of students, instructors, job applicants, researchers and policymakers. In line with growing demands for fair psychological assessment tools, we aimed to identify psychometric features of tests, test situations and test-taker characteristics that may contribute to the emergence of test bias. Multi-level random effects meta-analyses were conducted to estimate mean effect sizes for differences and relations between scores from achievement or aptitude measures with open-ended (OE) versus closed-ended (CE) response formats. Results from 102 primary studies with 392 effect sizes revealed positive relations between CE and OE assessments (mean r = 0.67, 95% CI [0.57; 0.76]), with negative pooled effect sizes for the difference between the two response formats (mean d_av = −0.65; 95% CI [−0.78; −0.53]). Significantly higher scores were obtained on CE exams. Stem-equivalency of items, low-stakes test situations, written short answer OE question types, studies conducted outside the United States and before the year 2000, and test-takers' achievement motivation and sex were at least partially associated with smaller differences and/or larger relations between scores from OE and CE formats. Limitations and the results' implications for practitioners in achievement and aptitude testing are discussed.
format Online
Article
Text
id pubmed-10154931
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher The Royal Society
record_format MEDLINE/PubMed
spelling pubmed-10154931 2023-05-04 Effects of response format on achievement and aptitude assessment results: multi-level random effects meta-analyses Breuer, Sonja Scherndl, Thomas Ortner, Tuulia M. R Soc Open Sci Psychology and Cognitive Neuroscience Psychological achievement and aptitude tests are fundamental elements of the everyday school, academic and professional lives of students, instructors, job applicants, researchers and policymakers. In line with growing demands for fair psychological assessment tools, we aimed to identify psychometric features of tests, test situations and test-taker characteristics that may contribute to the emergence of test bias. Multi-level random effects meta-analyses were conducted to estimate mean effect sizes for differences and relations between scores from achievement or aptitude measures with open-ended (OE) versus closed-ended (CE) response formats. Results from 102 primary studies with 392 effect sizes revealed positive relations between CE and OE assessments (mean r = 0.67, 95% CI [0.57; 0.76]), with negative pooled effect sizes for the difference between the two response formats (mean d_av = −0.65; 95% CI [−0.78; −0.53]). Significantly higher scores were obtained on CE exams. Stem-equivalency of items, low-stakes test situations, written short answer OE question types, studies conducted outside the United States and before the year 2000, and test-takers' achievement motivation and sex were at least partially associated with smaller differences and/or larger relations between scores from OE and CE formats. Limitations and the results' implications for practitioners in achievement and aptitude testing are discussed. The Royal Society 2023-05-03 /pmc/articles/PMC10154931/ /pubmed/37153364 http://dx.doi.org/10.1098/rsos.220456 Text en © 2023 The Authors. Published by the Royal Society under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, provided the original author and source are credited.
spellingShingle Psychology and Cognitive Neuroscience
Breuer, Sonja
Scherndl, Thomas
Ortner, Tuulia M.
Effects of response format on achievement and aptitude assessment results: multi-level random effects meta-analyses
title Effects of response format on achievement and aptitude assessment results: multi-level random effects meta-analyses
title_full Effects of response format on achievement and aptitude assessment results: multi-level random effects meta-analyses
title_fullStr Effects of response format on achievement and aptitude assessment results: multi-level random effects meta-analyses
title_full_unstemmed Effects of response format on achievement and aptitude assessment results: multi-level random effects meta-analyses
title_short Effects of response format on achievement and aptitude assessment results: multi-level random effects meta-analyses
title_sort effects of response format on achievement and aptitude assessment results: multi-level random effects meta-analyses
topic Psychology and Cognitive Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10154931/
https://www.ncbi.nlm.nih.gov/pubmed/37153364
http://dx.doi.org/10.1098/rsos.220456
work_keys_str_mv AT breuersonja effectsofresponseformatonachievementandaptitudeassessmentresultsmultilevelrandomeffectsmetaanalyses
AT scherndlthomas effectsofresponseformatonachievementandaptitudeassessmentresultsmultilevelrandomeffectsmetaanalyses
AT ortnertuuliam effectsofresponseformatonachievementandaptitudeassessmentresultsmultilevelrandomeffectsmetaanalyses