
Medical students create multiple-choice questions for learning in pathology education: a pilot study


Bibliographic Details
Main Authors: Grainger, Rebecca, Dai, Wei, Osborne, Emma, Kenwright, Diane
Format: Online Article Text
Language: English
Published: BioMed Central 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6103861/
https://www.ncbi.nlm.nih.gov/pubmed/30134898
http://dx.doi.org/10.1186/s12909-018-1312-1
_version_ 1783349384336900096
author Grainger, Rebecca
Dai, Wei
Osborne, Emma
Kenwright, Diane
author_facet Grainger, Rebecca
Dai, Wei
Osborne, Emma
Kenwright, Diane
author_sort Grainger, Rebecca
collection PubMed
description BACKGROUND: Medical students facing high-stakes exams want study resources that have a direct relationship with their assessments. At the same time, they need to develop the skills to think analytically about complex clinical problems. Multiple-choice questions (MCQs) are widely used in medical education and can promote surface learning strategies, but creating MCQs requires both in-depth content knowledge and sophisticated analytical thinking. Therefore, we piloted an MCQ-writing task in which students developed MCQs for their peers to answer. METHODS: Students in a fourth-year anatomic pathology course (N = 106) were required to write MCQs using the PeerWise platform. Students created two MCQs for each of four topic areas and the MCQs were answered, rated and commented on by their classmates. Questions were rated for cognitive complexity and a paper-based survey was administered to investigate whether this activity was acceptable, feasible, and whether it promoted desirable learning behaviours in students. RESULTS: Students were able to create cognitively challenging MCQs: 313/421 (74%) of the MCQs which we rated required the respondent to apply or analyse pathology knowledge. However, students who responded to the end-of-course questionnaire (N = 62) saw the task as having little educational value. Students found PeerWise easy to use, and indicated that they read widely to prepare questions and monitored the quality of their questions. They did not, however, engage in extensive peer feedback via PeerWise. CONCLUSIONS: Our study showed that the MCQ writing task was feasible and engaged students in self-evaluation and synthesising information from a range of sources, but it was not well accepted and did not strongly engage students in peer-learning. Although students were able to create complex MCQs, they found some aspects of the writing process burdensome and tended not to trust the quality of each other’s MCQs. 
Given the evidence that this task did promote deep learning, it is worth continuing this mode of teaching if it can be made more acceptable to students.
format Online
Article
Text
id pubmed-6103861
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-61038612018-08-30 Medical students create multiple-choice questions for learning in pathology education: a pilot study Grainger, Rebecca Dai, Wei Osborne, Emma Kenwright, Diane BMC Med Educ Research Article BACKGROUND: Medical students facing high-stakes exams want study resources that have a direct relationship with their assessments. At the same time, they need to develop the skills to think analytically about complex clinical problems. Multiple-choice questions (MCQs) are widely used in medical education and can promote surface learning strategies, but creating MCQs requires both in-depth content knowledge and sophisticated analytical thinking. Therefore, we piloted an MCQ-writing task in which students developed MCQs for their peers to answer. METHODS: Students in a fourth-year anatomic pathology course (N = 106) were required to write MCQs using the PeerWise platform. Students created two MCQs for each of four topic areas and the MCQs were answered, rated and commented on by their classmates. Questions were rated for cognitive complexity and a paper-based survey was administered to investigate whether this activity was acceptable, feasible, and whether it promoted desirable learning behaviours in students. RESULTS: Students were able to create cognitively challenging MCQs: 313/421 (74%) of the MCQs which we rated required the respondent to apply or analyse pathology knowledge. However, students who responded to the end-of-course questionnaire (N = 62) saw the task as having little educational value. Students found PeerWise easy to use, and indicated that they read widely to prepare questions and monitored the quality of their questions. They did not, however, engage in extensive peer feedback via PeerWise. 
CONCLUSIONS: Our study showed that the MCQ writing task was feasible and engaged students in self-evaluation and synthesising information from a range of sources, but it was not well accepted and did not strongly engage students in peer-learning. Although students were able to create complex MCQs, they found some aspects of the writing process burdensome and tended not to trust the quality of each other’s MCQs. Given the evidence that this task did promote deep learning, it is worth continuing this mode of teaching if it can be made more acceptable to students. BioMed Central 2018-08-22 /pmc/articles/PMC6103861/ /pubmed/30134898 http://dx.doi.org/10.1186/s12909-018-1312-1 Text en © The Author(s). 2018 Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
spellingShingle Research Article
Grainger, Rebecca
Dai, Wei
Osborne, Emma
Kenwright, Diane
Medical students create multiple-choice questions for learning in pathology education: a pilot study
title Medical students create multiple-choice questions for learning in pathology education: a pilot study
title_full Medical students create multiple-choice questions for learning in pathology education: a pilot study
title_fullStr Medical students create multiple-choice questions for learning in pathology education: a pilot study
title_full_unstemmed Medical students create multiple-choice questions for learning in pathology education: a pilot study
title_short Medical students create multiple-choice questions for learning in pathology education: a pilot study
title_sort medical students create multiple-choice questions for learning in pathology education: a pilot study
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6103861/
https://www.ncbi.nlm.nih.gov/pubmed/30134898
http://dx.doi.org/10.1186/s12909-018-1312-1
work_keys_str_mv AT graingerrebecca medicalstudentscreatemultiplechoicequestionsforlearninginpathologyeducationapilotstudy
AT daiwei medicalstudentscreatemultiplechoicequestionsforlearninginpathologyeducationapilotstudy
AT osborneemma medicalstudentscreatemultiplechoicequestionsforlearninginpathologyeducationapilotstudy
AT kenwrightdiane medicalstudentscreatemultiplechoicequestionsforlearninginpathologyeducationapilotstudy