
Comparing single-best-answer and very-short-answer questions for the assessment of applied medical knowledge in 20 UK medical schools: Cross-sectional study

OBJECTIVES: The study aimed to compare candidate performance between traditional best-of-five single-best-answer (SBA) questions and very-short-answer (VSA) questions, in which candidates must generate their own answers of between one and five words. The primary objective was to determine if the mean positive cue rate for SBAs exceeded the null hypothesis guessing rate of 20%.

Full description

Bibliographic Details
Main Authors: Sam, Amir H; Westacott, Rachel; Gurnell, Mark; Wilson, Rebecca; Meeran, Karim; Brown, Celia
Format: Online, Article, Text
Language: English
Published: BMJ Publishing Group, 2019
Subjects: Medical Education and Training
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6773319/
https://www.ncbi.nlm.nih.gov/pubmed/31558462
http://dx.doi.org/10.1136/bmjopen-2019-032550
collection PubMed
description OBJECTIVES: The study aimed to compare candidate performance between traditional best-of-five single-best-answer (SBA) questions and very-short-answer (VSA) questions, in which candidates must generate their own answers of between one and five words. The primary objective was to determine if the mean positive cue rate for SBAs exceeded the null hypothesis guessing rate of 20%. DESIGN: This was a cross-sectional study undertaken in 2018. SETTING: 20 medical schools in the UK. PARTICIPANTS: 1417 volunteer medical students preparing for their final undergraduate medicine examinations (total eligible population across all UK medical schools approximately 7500). INTERVENTIONS: Students completed a 50-question VSA test, followed immediately by the same test in SBA format, using a novel digital exam delivery platform which also facilitated rapid marking of VSAs. MAIN OUTCOME MEASURES: The main outcome measure was the mean positive cue rate across SBAs: the percentage of students getting the SBA format of the question correct after getting the VSA format incorrect. Internal consistency, item discrimination and the pass rate using Cohen standard setting for VSAs and SBAs were also evaluated, and a cost analysis in terms of marking the VSA was performed. RESULTS: The study was completed by 1417 students. Mean student scores were 21 percentage points higher for SBAs. The mean positive cue rate was 42.7% (95% CI 36.8% to 48.6%), one-sample t-test against ≤20%: t=7.53, p<0.001. Internal consistency was higher for VSAs than SBAs and the median item discrimination equivalent. The estimated marking cost was £2655 ($3500), with 24.5 hours of clinician time required (1.25 s per student per question). CONCLUSIONS: SBA questions can give a false impression of students' competence. VSAs appear to have greater authenticity and can provide useful information regarding students' cognitive errors, helping to improve learning as well as assessment. Electronic delivery and marking of VSAs is feasible and cost-effective.
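The abstract's marking-time figure and its positive-cue-rate measure can be sanity-checked with a short script. This is an illustrative sketch only, not the authors' code; the per-question response lists below are invented toy data, and the cue-rate denominator (students who got the VSA wrong) is one plausible reading of the definition given in the abstract.

```python
# Marking time: 1417 students x 50 questions x 1.25 s of clinician time each.
students = 1417
questions = 50
seconds_per_item = 1.25

total_hours = students * questions * seconds_per_item / 3600
print(round(total_hours, 1))  # ≈ 24.6 h, consistent with the reported 24.5 hours

# Positive cue rate for one question: among students who answered the VSA
# incorrectly, the percentage who then answered the SBA format correctly.
# The responses below are hypothetical toy data for five students.
vsa_correct = [False, False, True, False, True]
sba_correct = [True, False, True, True, True]

cued = sum(s and not v for v, s in zip(vsa_correct, sba_correct))
vsa_wrong = sum(not v for v in vsa_correct)
positive_cue_rate = 100 * cued / vsa_wrong
print(round(positive_cue_rate, 1))  # 2 of the 3 VSA-wrong students were cued
```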
format Online, Article, Text
id pubmed-6773319
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher BMJ Publishing Group
record_format MEDLINE/PubMed
spelling pubmed-6773319 2019-10-21 BMJ Open, Medical Education and Training. BMJ Publishing Group 2019-09-26 /pmc/articles/PMC6773319/ /pubmed/31558462 http://dx.doi.org/10.1136/bmjopen-2019-032550 Text en © Author(s) (or their employer(s)) 2019. Re-use permitted under CC BY-NC. No commercial re-use. See rights and permissions. Published by BMJ. This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license. See: http://creativecommons.org/licenses/by-nc/4.0/.
topic Medical Education and Training