
An IRT–Multiple Indicators Multiple Causes (MIMIC) Approach as a Method of Examining Item Response Latency


Bibliographic Details
Main authors: Tsaousis, Ioannis, Sideridis, Georgios D., Al-Sadaawi, Abdullah
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2018
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6277868/
https://www.ncbi.nlm.nih.gov/pubmed/30542303
http://dx.doi.org/10.3389/fpsyg.2018.02177
_version_ 1783378245450727424
author Tsaousis, Ioannis
Sideridis, Georgios D.
Al-Sadaawi, Abdullah
author_facet Tsaousis, Ioannis
Sideridis, Georgios D.
Al-Sadaawi, Abdullah
author_sort Tsaousis, Ioannis
collection PubMed
description The analysis of response time has received increasing attention in recent decades, as evidence from several studies supports a direct relationship between item response time and test performance. The aim of this study was to investigate whether item response latency affects persons' ability parameters; that is, whether it represents an adaptive or a maladaptive practice. To examine this research question, data from 8,475 individuals completing the computerized version of the Postgraduate General Aptitude Test (PAGAT) were analyzed. To determine the extent to which response latency affects persons' ability, we used a Multiple Indicators Multiple Causes (MIMIC) model, in which every item in a scale was linked to its corresponding covariate (i.e., item response latency). We ran the MIMIC model within the Item Response Theory (IRT) framework (2-PL model). The results supported the hypothesis that item response latency can provide valuable information for obtaining more accurate estimates of persons' ability levels. Results indicated that individuals who invest more time on easy items do not improve their likelihood of success, most likely because slow and fast responders have significantly different levels of ability (fast responders are of higher ability than slow responders). Consequently, investing more time does not prove adaptive for low-ability individuals. The opposite was found for difficult items: individuals who spend more time on difficult items increase their likelihood of success, most likely because they are high achievers (on difficult items, individuals who spent more time were of significantly higher ability than fast responders). Thus, there appears to be an interaction between item difficulty and person ability that explains the effects of response time on the likelihood of success.
We concluded that accommodating item response latency in a computerized assessment model can inform test quality and test takers' behavior and thereby enhance score measurement accuracy.
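The modeling idea described above — a 2-PL IRT model in which each item receives a direct effect from its own response-latency covariate — can be sketched as a small simulation. This is an illustrative sketch with made-up parameter values, not the authors' actual model code; the item-level coefficient `gamma` is a hypothetical stand-in for the MIMIC direct-effect paths from each latency covariate to its item.

```python
import numpy as np

rng = np.random.default_rng(0)

n_persons, n_items = 1000, 10
theta = rng.normal(0.0, 1.0, n_persons)              # person ability
a = rng.uniform(0.8, 2.0, n_items)                   # 2-PL discrimination
b = rng.normal(0.0, 1.0, n_items)                    # 2-PL difficulty
gamma = rng.normal(0.0, 0.3, n_items)                # direct latency effect per item (hypothetical)
log_rt = rng.normal(0.0, 1.0, (n_persons, n_items))  # standardized log response time

# 2-PL with an item-level latency covariate (MIMIC-style direct effect):
#   logit P(X_ij = 1) = a_j * (theta_i - b_j) + gamma_j * RT_ij
logits = a * (theta[:, None] - b) + gamma * log_rt
p = 1.0 / (1.0 + np.exp(-logits))                    # success probabilities
x = rng.binomial(1, p)                               # simulated binary item responses

print(x.shape)  # → (1000, 10)
```

Under this sketch, a positive `gamma[j]` means that spending more time on item j raises the odds of success over and above ability, while a negative value means extra time is maladaptive — mirroring the easy-item vs. difficult-item contrast the abstract reports.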
format Online
Article
Text
id pubmed-6277868
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-62778682018-12-12 An IRT–Multiple Indicators Multiple Causes (MIMIC) Approach as a Method of Examining Item Response Latency Tsaousis, Ioannis Sideridis, Georgios D. Al-Sadaawi, Abdullah Front Psychol Psychology The analysis of response time has received increasing attention in recent decades, as evidence from several studies supports a direct relationship between item response time and test performance. The aim of this study was to investigate whether item response latency affects persons' ability parameters; that is, whether it represents an adaptive or a maladaptive practice. To examine this research question, data from 8,475 individuals completing the computerized version of the Postgraduate General Aptitude Test (PAGAT) were analyzed. To determine the extent to which response latency affects persons' ability, we used a Multiple Indicators Multiple Causes (MIMIC) model, in which every item in a scale was linked to its corresponding covariate (i.e., item response latency). We ran the MIMIC model within the Item Response Theory (IRT) framework (2-PL model). The results supported the hypothesis that item response latency can provide valuable information for obtaining more accurate estimates of persons' ability levels. Results indicated that individuals who invest more time on easy items do not improve their likelihood of success, most likely because slow and fast responders have significantly different levels of ability (fast responders are of higher ability than slow responders). Consequently, investing more time does not prove adaptive for low-ability individuals. The opposite was found for difficult items: individuals who spend more time on difficult items increase their likelihood of success, most likely because they are high achievers (on difficult items, individuals who spent more time were of significantly higher ability than fast responders).
Thus, there appears to be an interaction between item difficulty and person ability that explains the effects of response time on the likelihood of success. We concluded that accommodating item response latency in a computerized assessment model can inform test quality and test takers' behavior and thereby enhance score measurement accuracy. Frontiers Media S.A. 2018-11-13 /pmc/articles/PMC6277868/ /pubmed/30542303 http://dx.doi.org/10.3389/fpsyg.2018.02177 Text en Copyright © 2018 Tsaousis, Sideridis and Al-Sadaawi. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Psychology
Tsaousis, Ioannis
Sideridis, Georgios D.
Al-Sadaawi, Abdullah
An IRT–Multiple Indicators Multiple Causes (MIMIC) Approach as a Method of Examining Item Response Latency
title An IRT–Multiple Indicators Multiple Causes (MIMIC) Approach as a Method of Examining Item Response Latency
title_full An IRT–Multiple Indicators Multiple Causes (MIMIC) Approach as a Method of Examining Item Response Latency
title_fullStr An IRT–Multiple Indicators Multiple Causes (MIMIC) Approach as a Method of Examining Item Response Latency
title_full_unstemmed An IRT–Multiple Indicators Multiple Causes (MIMIC) Approach as a Method of Examining Item Response Latency
title_short An IRT–Multiple Indicators Multiple Causes (MIMIC) Approach as a Method of Examining Item Response Latency
title_sort irt–multiple indicators multiple causes (mimic) approach as a method of examining item response latency
topic Psychology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6277868/
https://www.ncbi.nlm.nih.gov/pubmed/30542303
http://dx.doi.org/10.3389/fpsyg.2018.02177
work_keys_str_mv AT tsaousisioannis anirtmultipleindicatorsmultiplecausesmimicapproachasamethodofexaminingitemresponselatency
AT sideridisgeorgiosd anirtmultipleindicatorsmultiplecausesmimicapproachasamethodofexaminingitemresponselatency
AT alsadaawiabdullah anirtmultipleindicatorsmultiplecausesmimicapproachasamethodofexaminingitemresponselatency
AT tsaousisioannis irtmultipleindicatorsmultiplecausesmimicapproachasamethodofexaminingitemresponselatency
AT sideridisgeorgiosd irtmultipleindicatorsmultiplecausesmimicapproachasamethodofexaminingitemresponselatency
AT alsadaawiabdullah irtmultipleindicatorsmultiplecausesmimicapproachasamethodofexaminingitemresponselatency