Process and outcome evaluation of a CBME intervention guided by program theory
| Main authors | |
|---|---|
| Format | Online article, full text |
| Language | English |
| Published | John Wiley & Sons, Inc., 2020 |
| Subjects | |
| Online access | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7496603/ · https://www.ncbi.nlm.nih.gov/pubmed/31927788 · http://dx.doi.org/10.1111/jep.13344 |
Summary:

RATIONALE: Competency‐based medical education (CBME) has gained momentum as an improved training model, but literature on outcomes of CBME, including evaluation of implementation processes, is minimal. We present a case for the following: (a) the development of a program theory is essential prior to or in the initial stages of implementation of CBME; (b) the program theory should guide the strategies and methods for evaluation that will answer questions about anticipated and unintended outcomes; and (c) the iterative process of testing assumptions and hypotheses will lead to modifications to the program theory to inform best practices for implementing CBME.

METHODS: We use the Triple C Competency‐based Curriculum as a worked example to illustrate how process and outcome evaluation, guided by a program theory, can lead to meaningful enhancement of CBME curriculum, assessment, and implementation strategies. Using a mixed methods design, the processes and outcomes of Triple C were explored through surveys, interviews, and historical document review, which captured the experiences of various stakeholders.

FINDINGS: The theory‐led program evaluation process identified areas that supported CBME implementation: the value of a strong, nondirective national vertical core supporting the transformation in education, program autonomy, and adaptability to the pre‐existing local context. Areas in need of improvement included the need for ongoing support from the College of Family Physicians of Canada (CFPC) and better planning for shifts in program leadership over time.

CONCLUSIONS: Deliberately pairing evaluation with change is an important activity and, when accomplished, yields valuable information from the experiences of those implementing and experiencing a program. Evaluation and the development of an updated program theory facilitate the introduction of new changes and theories that build on these findings, which also supports the desired goal of contributing toward cumulative science rather than "reinventing the wheel."