
Classification Accuracy of Mixed Format Tests: A Bi-Factor Item Response Theory Approach

Mixed format tests (e.g., a test consisting of multiple-choice [MC] items and constructed response [CR] items) have become increasingly popular. However, the latent structure of item pools consisting of the two formats is still equivocal. Moreover, the implications of this latent structure are unclear: For example, do constructed response items tap reasoning skills that cannot be assessed with multiple choice items? This study explored the dimensionality of mixed format tests by applying bi-factor models to 10 tests of various subjects from the College Board's Advanced Placement (AP) Program and compared the accuracy of scores based on the bi-factor analysis with scores derived from a unidimensional analysis. More importantly, this study focused on a practical and important question—classification accuracy of the overall grade on a mixed format test. Our findings revealed that the degree of multidimensionality resulting from the mixed item format varied from subject to subject, depending on the disattenuated correlation between scores from MC and CR subtests. Moreover, remarkably small decrements in classification accuracy were found for the unidimensional analysis when the disattenuated correlations exceeded 0.90.
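The abstract's key quantity, the disattenuated correlation between MC and CR subtest scores, is Spearman's classic correction for attenuation: the observed correlation divided by the square root of the product of the two subtests' reliabilities. A minimal sketch of that formula (the numeric values are hypothetical illustrations, not results from this study):

```python
import math

def disattenuated_correlation(r_xy: float, rel_x: float, rel_y: float) -> float:
    """Spearman's correction for attenuation: the correlation two subtest
    scores would show if both were measured without error.

    r_xy   -- observed correlation between the two subtest scores
    rel_x  -- reliability of subtest X (e.g., the MC subtest)
    rel_y  -- reliability of subtest Y (e.g., the CR subtest)
    """
    return r_xy / math.sqrt(rel_x * rel_y)

# Hypothetical example: observed MC-CR correlation 0.78,
# subtest reliabilities 0.90 and 0.82.
r_true = disattenuated_correlation(0.78, 0.90, 0.82)
print(round(r_true, 3))  # about 0.908 -- above the 0.90 threshold noted in the abstract
```

On the study's logic, a disattenuated correlation this close to 1 suggests the two formats measure largely the same construct, which is why the unidimensional analysis loses so little classification accuracy in those subjects.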


Bibliographic Details
Main Authors: Wang, Wei; Drasgow, Fritz; Liu, Liwen
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2016
Subjects: Psychology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4770050/
https://www.ncbi.nlm.nih.gov/pubmed/26973568
http://dx.doi.org/10.3389/fpsyg.2016.00270
Collection: PubMed
ID: pubmed-4770050
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Front Psychol (Psychology)
Published online: 2016-02-29
Copyright © 2016 Wang, Drasgow and Liu. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.