
Implementation of a multi-level evaluation strategy: a case study on a program for international medical graduates

Evaluation of educational interventions is often focused on immediate and/or short-term metrics associated with knowledge and/or skills acquisition. We developed an educational intervention to support international medical graduates working in rural Victoria. We wanted an evaluation strategy that included participants' reactions and considered transfer of learning to the workplace and retention of learning. However, with participants in distributed locations and limited program resources, this was likely to prove challenging. Elsewhere, we have reported the outcomes of this evaluation. In this educational development report, we describe our evaluation strategy as a case study, its underpinning theoretical framework, the strategy, and its benefits and challenges. The strategy sought to address issues of program structure, process, and outcomes. We used a modified version of Kirkpatrick's model as a framework to map our evaluation of participants' experiences, acquisition of knowledge and skills, and their application in the workplace. The predominant benefit was that most of the evaluation instruments allowed for personalization of the program. The baseline instruments provided a broad view of participants' expectations, needs, and current perspective on their role. Immediate evaluation instruments allowed ongoing tailoring of the program to meet learning needs. Intermediate evaluations facilitated insight on the transfer of learning. The principal challenge related to the resource-intensive nature of the evaluation strategy. A dedicated program administrator was required to manage data collection. Although resource-intensive, we recommend baseline, immediate, and intermediate data collection points, with multi-source feedback being especially illuminating. We believe our experiences may be valuable to faculty involved in program evaluations.


Bibliographic Details
Main Authors: Nestel, Debra; Regan, Melanie; Vijayakumar, Priyanga; Sunderji, Irum; Haigh, Cathy; Smith, Cathy; Wright, Alistair
Format: Online Article Text
Language: English
Published: National Health Personnel Licensing Examination Board of the Republic of Korea, 2011
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3258549/
https://www.ncbi.nlm.nih.gov/pubmed/22259672
http://dx.doi.org/10.3352/jeehp.2011.8.13
Journal: J Educ Eval Health Prof
Subject: Educational/Faculty Development Material
Published: 2011-12-17
© 2011, National Health Personnel Licensing Examination Board of the Republic of Korea. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.