Development of an Instructional Design Evaluation Survey for Postgraduate Medical E-Learning: Content Validation Study

Bibliographic Details
Main Authors: de Leeuw, Robert Adrianus, Westerman, Michiel, Walsh, Kieran, Scheele, Fedde
Format: Online Article Text
Language: English
Published: JMIR Publications 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6713039/
https://www.ncbi.nlm.nih.gov/pubmed/31400102
http://dx.doi.org/10.2196/13921
_version_ 1783446802478923776
author de Leeuw, Robert Adrianus
Westerman, Michiel
Walsh, Kieran
Scheele, Fedde
author_facet de Leeuw, Robert Adrianus
Westerman, Michiel
Walsh, Kieran
Scheele, Fedde
author_sort de Leeuw, Robert Adrianus
collection PubMed
description BACKGROUND: E-learning has taken a firm place in postgraduate medical education. Whereas 10 years ago it was promising, it now has a definite niche and is clearly here to stay. However, evaluating the effect of postgraduate medical e-learning (PGMeL) and improving upon it can be complicated. While the learning aims of e-learning are evaluated, there are no instruments to evaluate the instructional design of PGMeL. Such an evaluation instrument may be developed by following the Association for Medical Education in Europe (AMEE) 7-step process. The first 5 steps of this process were previously performed through literature reviews, focus group discussions, and an international Delphi study.
OBJECTIVE: This study continues with steps 6 and 7 and answers the research question: Is a content-validated PGMeL evaluation survey useful, understandable, and of added value for creators of e-learning?
METHODS: There were five phases in this study: creating a survey from 37 items (phase A); testing readability and question interpretation (phase B); adjusting, rewriting, and translating the survey (phase C); gathering completed surveys from three PGMeL modules (phase D); and holding focus group discussions with the e-learning authors (phase E). Phase E was carried out by presenting the results of the phase D evaluations, followed by a group discussion. Four groups participated in this study. Groups A and B were experienced end users of PGMeL and took part in phase B. Group C comprised users who undertook the e-learning and were asked to complete the survey in phase D. Group D comprised the authors of the e-learning modules described above.
RESULTS: From a list of 36 items, we developed a postgraduate Medical E-Learning Evaluation Survey (MEES). Seven residents participated in the phase B group discussion: 4 items were interpreted differently, 3 were not readable, and 2 items were duplicates. The items from phase B were rewritten and, after adjustment, understood correctly. The MEES was translated into Dutch and again pilot-tested; all items were clear and understood correctly. The MEES version used for the evaluation contained 3 positive domains (motivation, learning enhancers, and real-world translation) and 2 negative domains (barriers and learning discouragers), with 36 items across those domains, 5 Likert-type questions scored from 1 to 10, and 5 open questions asking participants to give their own comments in each domain. Three e-learning modules were evaluated from July to November 2018, yielding a total of 158 responses from a Dutch module, a European OB/GYN (obstetrics and gynecology) module, and a surgical module offered worldwide. Finally, 3 focus group discussions took place with a total of 10 participants. Usefulness was much appreciated, understandability was good, and added value was high. Four items needed additional explanation by the authors, and a Creators’ Manual was written at their request.
CONCLUSIONS: The MEES is the first survey to evaluate the instructional design of PGMeL and was constructed following all 7 steps of the AMEE process. This study completes the design of the survey and shows its usefulness and added value to the authors. It finishes with a final, publicly available survey that includes a Creators’ Manual. We briefly discuss the number of responses needed and conclude that more is better; in the end, however, one has to work with what is available. The next steps are to see whether improvement can be measured using the MEES and to continue working on its understandability for end users across different languages and cultural groups.
format Online
Article
Text
id pubmed-6713039
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher JMIR Publications
record_format MEDLINE/PubMed
spelling pubmed-6713039 2019-08-30 Development of an Instructional Design Evaluation Survey for Postgraduate Medical E-Learning: Content Validation Study de Leeuw, Robert Adrianus Westerman, Michiel Walsh, Kieran Scheele, Fedde J Med Internet Res Original Paper JMIR Publications 2019-08-09 /pmc/articles/PMC6713039/ /pubmed/31400102 http://dx.doi.org/10.2196/13921 Text en ©Robert Adrianus de Leeuw, Michiel Westerman, Kieran Walsh, Fedde Scheele. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 09.08.2019. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.
spellingShingle Original Paper
de Leeuw, Robert Adrianus
Westerman, Michiel
Walsh, Kieran
Scheele, Fedde
Development of an Instructional Design Evaluation Survey for Postgraduate Medical E-Learning: Content Validation Study
title Development of an Instructional Design Evaluation Survey for Postgraduate Medical E-Learning: Content Validation Study
title_full Development of an Instructional Design Evaluation Survey for Postgraduate Medical E-Learning: Content Validation Study
title_fullStr Development of an Instructional Design Evaluation Survey for Postgraduate Medical E-Learning: Content Validation Study
title_full_unstemmed Development of an Instructional Design Evaluation Survey for Postgraduate Medical E-Learning: Content Validation Study
title_short Development of an Instructional Design Evaluation Survey for Postgraduate Medical E-Learning: Content Validation Study
title_sort development of an instructional design evaluation survey for postgraduate medical e-learning: content validation study
topic Original Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6713039/
https://www.ncbi.nlm.nih.gov/pubmed/31400102
http://dx.doi.org/10.2196/13921
work_keys_str_mv AT deleeuwrobertadrianus developmentofaninstructionaldesignevaluationsurveyforpostgraduatemedicalelearningcontentvalidationstudy
AT westermanmichiel developmentofaninstructionaldesignevaluationsurveyforpostgraduatemedicalelearningcontentvalidationstudy
AT walshkieran developmentofaninstructionaldesignevaluationsurveyforpostgraduatemedicalelearningcontentvalidationstudy
AT scheelefedde developmentofaninstructionaldesignevaluationsurveyforpostgraduatemedicalelearningcontentvalidationstudy