
A comparison of 3- and 4-option multiple-choice items for medical subspecialty in-training examinations


Bibliographic Details
Main Authors: Chen, Dandan; Harman, Ann E.; Sun, Huaping; Ye, Tianpeng; Gaiser, Robert R.
Format: Online Article Text
Language: English
Published: BioMed Central, 2023
Subjects: Research
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10134669/
https://www.ncbi.nlm.nih.gov/pubmed/37106417
http://dx.doi.org/10.1186/s12909-023-04277-2
Description:

BACKGROUND: The American Board of Anesthesiology piloted 3-option multiple-choice items (MCIs) in the 2020 administration of its 150-item subspecialty in-training examinations for Critical Care Medicine (ITE-CCM) and Pediatric Anesthesiology (ITE-PA). The 3-option MCIs were created from their 4-option counterparts, administered in 2019, by removing the least effective distractor. The purpose of this study was to compare physician performance, response time, and item and exam characteristics between the 4-option and 3-option exams.

METHODS: An independent-samples t-test was used to examine differences in physician percent-correct scores; a paired t-test was used to examine differences in response time and item characteristics. The Kuder-Richardson Formula 20 (KR-20) was used to estimate the reliability of each exam form. Both the traditional method (a distractor selected by fewer than 5% of examinees and/or showing a positive correlation with total score) and the sliding-scale method (adjusting the selection-frequency threshold by item difficulty) were used to identify non-functioning distractors (NFDs).

RESULTS: Physicians who took the 3-option ITE-CCM (mean = 67.7%) scored 2.1 percentage points higher than those who took the 4-option ITE-CCM (65.7%); accordingly, 3-option ITE-CCM items were significantly easier than their 4-option counterparts. No such difference was found between the 4-option and 3-option ITE-PAs (71.8% versus 71.7%). Item discrimination (4-option ITE-CCM, mean 0.13; 3-option ITE-CCM, 0.12; 4-option ITE-PA, 0.08; 3-option ITE-PA, 0.09) and exam reliability (0.75 and 0.74 for the 4- and 3-option ITE-CCMs, respectively; 0.62 and 0.67 for the 4- and 3-option ITE-PAs, respectively) were similar between the two formats for both ITEs. On average, physicians spent 3.4 seconds (55.5 versus 58.9) and 1.3 seconds (46.2 versus 47.5) less per item on 3-option items than on 4-option items for the ITE-CCM and ITE-PA, respectively. Using the traditional method, the percentage of NFDs dropped from 51.3% (4-option) to 37.0% (3-option) for the ITE-CCM and from 62.7% to 46.0% for the ITE-PA; using the sliding-scale method, it dropped from 36.0% to 21.7% for the ITE-CCM and from 44.9% to 27.7% for the ITE-PA.

CONCLUSIONS: Three-option MCIs function as robustly as their 4-option counterparts. The time saved per item creates opportunities to increase content coverage within a fixed testing period. The results should be interpreted in the context of exam content and the distribution of examinee abilities.

SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s12909-023-04277-2.
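The psychometric quantities named in the METHODS can be made concrete with a short sketch. The Python below is illustrative only, not the authors' implementation: it computes item difficulty, KR-20 reliability, and applies the traditional NFD rule (a distractor chosen by fewer than 5% of examinees, or one positively correlated with total score); the sliding-scale variant, which adjusts the 5% threshold by item difficulty, is not shown. All function names and the toy data are assumptions for illustration.

```python
# Illustrative item-analysis sketch (assumed helpers, not the study's code).
from statistics import mean, pvariance

def kr20(score_matrix):
    """KR-20 reliability for a 0/1 score matrix (rows = examinees, cols = items):
    (k / (k - 1)) * (1 - sum(p_i * q_i) / var(total))."""
    k = len(score_matrix[0])                       # number of items
    totals = [sum(row) for row in score_matrix]    # total score per examinee
    p = [mean(row[i] for row in score_matrix) for i in range(k)]  # item difficulty
    sum_pq = sum(pi * (1 - pi) for pi in p)
    return (k / (k - 1)) * (1 - sum_pq / pvariance(totals))

def point_biserial(xs, ys):
    """Pearson correlation between a 0/1 indicator and total scores."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx, sy = pvariance(xs) ** 0.5, pvariance(ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def nonfunctioning_distractors(responses, key, threshold=0.05):
    """Traditional NFD rule: flag a distractor chosen by < threshold of
    examinees, or whose choice correlates positively with total score.
    Returns (item_index, option) pairs."""
    n, k = len(responses), len(key)
    totals = [sum(1 for i in range(k) if row[i] == key[i]) for row in responses]
    flagged = []
    for i in range(k):
        for opt in {row[i] for row in responses} - {key[i]}:
            picked = [1 if row[i] == opt else 0 for row in responses]
            frac = sum(picked) / n
            if frac < threshold or point_biserial(picked, totals) > 0:
                flagged.append((i, opt))
    return flagged
```

For example, on a tiny 5-examinee, 4-item matrix such as `[[1,1,1,0],[1,1,0,0],[1,0,0,0],[1,1,1,1],[0,0,0,0]]`, `kr20` returns 0.8; real exams like the 150-item ITEs simply scale the same computation up.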
Journal: BMC Med Educ (Research). Published online 2023-04-27 by BioMed Central. © The Author(s) 2023. Open Access under the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).