Answering questions in a co‐created formative exam question bank improves summative exam performance, while students perceive benefits from answering, authoring, and peer discussion: A mixed methods analysis of PeerWise
Main Authors:
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc., 2021
Subjects:
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8311910/
https://www.ncbi.nlm.nih.gov/pubmed/34309243
http://dx.doi.org/10.1002/prp2.833
Summary: Multiple choice questions (MCQs) are a common form of assessment in medical schools, and students seek opportunities to engage with formative assessment that reflects their summative exams. Formative assessment with feedback and active learning strategies improve student learning outcomes, but a challenge for educators, particularly those with large class sizes, is how to provide students with such opportunities without overburdening faculty. To address this, we enrolled medical students in the online learning platform PeerWise, which enables students to author and answer MCQs, rate the quality of other students' contributions, and discuss content. A quasi‐experimental mixed methods research design was used to explore PeerWise use and its impact on the learning experience and exam results of fourth year medical students who were studying courses in clinical sciences and pharmacology. Most students chose to engage with PeerWise following its introduction as a noncompulsory learning opportunity. While students perceived benefits in authoring and peer discussion, students engaged most highly with answering questions, noting that this helped them identify gaps in knowledge, test their learning, and improve exam technique. Detailed analysis of the 2015 cohort (n = 444) with hierarchical regression models revealed a significant positive predictive relationship between answering PeerWise questions and exam results, even after controlling for previous academic performance, which was further confirmed with a follow‐up multi‐year analysis (2015–2018, n = 1693). These 4 years of quantitative data corroborated students' belief in the benefit of answering peer‐authored questions for learning.