
Association of Online Learning Behavior and Learning Outcomes for Medical Students: Large-Scale Usage Data Analysis

Bibliographic Details
Main Authors: Bientzle, Martina, Hircin, Emrah, Kimmerle, Joachim, Knipfer, Christian, Smeets, Ralf, Gaudin, Robert, Holtz, Peter
Format: Online Article Text
Language: English
Published: JMIR Publications 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6724501/
https://www.ncbi.nlm.nih.gov/pubmed/31436166
http://dx.doi.org/10.2196/13529
_version_ 1783449008086188032
author Bientzle, Martina
Hircin, Emrah
Kimmerle, Joachim
Knipfer, Christian
Smeets, Ralf
Gaudin, Robert
Holtz, Peter
author_facet Bientzle, Martina
Hircin, Emrah
Kimmerle, Joachim
Knipfer, Christian
Smeets, Ralf
Gaudin, Robert
Holtz, Peter
author_sort Bientzle, Martina
collection PubMed
description BACKGROUND: Digital learning environments have become very common in the training of medical professionals, and students often use such platforms for exam preparation. Multiple choice questions (MCQs) are a common format in medical exams, and students use them to prepare for these exams. OBJECTIVE: We aimed to examine whether particular learning activities contributed more strongly than others to users’ exam performance. METHODS: We analyzed data from users of an online platform that provides learning materials for medical students preparing for their final exams. We examined whether the number of learning cards viewed and the number of MCQs taken were positively related to learning outcomes, and whether viewing learning cards or answering MCQs was more effective. Finally, we tested whether taking individual notes predicted learning outcomes, and whether note taking had an effect after controlling for the effects of learning cards and MCQs. Our analyses are based on user activity data from the online platform Amboss, which supplied the number of learning cards studied and test questions answered. We also included the number of notes taken by each of the 23,633 users who had studied at least 200 learning cards and answered at least 1000 test questions in the 180 days before their state exam. The activity data for this analysis were collected retrospectively from Amboss archival usage data covering April 2014 to April 2017. Learning outcomes were measured using final state exam scores, which were calculated from the answers voluntarily entered by the participants. RESULTS: The number of cards studied (r=.22; P<.001) and the number of test questions answered (r=.23; P<.001) both correlated with the percentage of correct answers in the learners’ medical exams.
The number of test questions answered still yielded a significant effect after controlling for the number of learning cards studied in a hierarchical regression analysis (β=.14; P<.001; ΔR²=.017; P<.001). We found a negative interaction between the number of learning cards and the number of MCQs, indicating that users with high scores on both learning cards and MCQs had the highest exam scores. The 8040 participants who had taken at least one note had a higher percentage of correct answers (80.94%; SD=7.44) than those who had taken no notes (78.73%; SD=7.80; t(23631)=20.95; P<.001). In a stepwise regression, the number of notes taken predicted the percentage of correct answers over and above the effects of the number of learning cards studied and the number of test questions answered in step one (β=.06; P<.001; ΔR²=.004; P<.001). CONCLUSIONS: These results show that online learning platforms are particularly helpful when learners engage in active elaboration of the learning material, such as by answering MCQs or taking notes.
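The hierarchical (stepwise) regression described above enters learning cards in step one, then adds MCQs and checks the gain in explained variance (the reported ΔR²=.017). A minimal sketch of that procedure on synthetic data, assuming nothing about the actual Amboss dataset (the variable names, sample size, and simulated effect sizes below are illustrative assumptions only):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical activity data: cards studied and MCQs answered (correlated predictors).
cards = rng.normal(1000, 300, n)
mcqs = 0.5 * cards + rng.normal(1500, 400, n)
# Simulated exam score with weak positive effects of both activities, plus noise.
score = 70 + 0.004 * cards + 0.003 * mcqs + rng.normal(0, 7, n)

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with an intercept column."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

# Step 1: learning cards only; Step 2: learning cards plus MCQs.
r2_step1 = r_squared(cards.reshape(-1, 1), score)
r2_step2 = r_squared(np.column_stack([cards, mcqs]), score)
delta_r2 = r2_step2 - r2_step1  # variance explained by MCQs over and above cards
```

For nested OLS models, `delta_r2` is nonnegative by construction; whether it is *significantly* greater than zero is tested with an incremental F test (or the β of the added predictor), which is what the abstract's ΔR² P value reports.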
format Online
Article
Text
id pubmed-6724501
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher JMIR Publications
record_format MEDLINE/PubMed
spelling pubmed-67245012019-09-19 Association of Online Learning Behavior and Learning Outcomes for Medical Students: Large-Scale Usage Data Analysis Bientzle, Martina Hircin, Emrah Kimmerle, Joachim Knipfer, Christian Smeets, Ralf Gaudin, Robert Holtz, Peter JMIR Med Educ Original Paper JMIR Publications 2019-08-21 /pmc/articles/PMC6724501/ /pubmed/31436166 http://dx.doi.org/10.2196/13529 Text en ©Martina Bientzle, Emrah Hircin, Joachim Kimmerle, Christian Knipfer, Ralf Smeets, Robert Gaudin, Peter Holtz. Originally published in JMIR Medical Education (http://mededu.jmir.org), 21.08.2019. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited.
The complete bibliographic information, a link to the original publication on http://mededu.jmir.org/, as well as this copyright and license information must be included.
spellingShingle Original Paper
Bientzle, Martina
Hircin, Emrah
Kimmerle, Joachim
Knipfer, Christian
Smeets, Ralf
Gaudin, Robert
Holtz, Peter
Association of Online Learning Behavior and Learning Outcomes for Medical Students: Large-Scale Usage Data Analysis
title Association of Online Learning Behavior and Learning Outcomes for Medical Students: Large-Scale Usage Data Analysis
title_full Association of Online Learning Behavior and Learning Outcomes for Medical Students: Large-Scale Usage Data Analysis
title_fullStr Association of Online Learning Behavior and Learning Outcomes for Medical Students: Large-Scale Usage Data Analysis
title_full_unstemmed Association of Online Learning Behavior and Learning Outcomes for Medical Students: Large-Scale Usage Data Analysis
title_short Association of Online Learning Behavior and Learning Outcomes for Medical Students: Large-Scale Usage Data Analysis
title_sort association of online learning behavior and learning outcomes for medical students: large-scale usage data analysis
topic Original Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6724501/
https://www.ncbi.nlm.nih.gov/pubmed/31436166
http://dx.doi.org/10.2196/13529
work_keys_str_mv AT bientzlemartina associationofonlinelearningbehaviorandlearningoutcomesformedicalstudentslargescaleusagedataanalysis
AT hircinemrah associationofonlinelearningbehaviorandlearningoutcomesformedicalstudentslargescaleusagedataanalysis
AT kimmerlejoachim associationofonlinelearningbehaviorandlearningoutcomesformedicalstudentslargescaleusagedataanalysis
AT knipferchristian associationofonlinelearningbehaviorandlearningoutcomesformedicalstudentslargescaleusagedataanalysis
AT smeetsralf associationofonlinelearningbehaviorandlearningoutcomesformedicalstudentslargescaleusagedataanalysis
AT gaudinrobert associationofonlinelearningbehaviorandlearningoutcomesformedicalstudentslargescaleusagedataanalysis
AT holtzpeter associationofonlinelearningbehaviorandlearningoutcomesformedicalstudentslargescaleusagedataanalysis