A national training program for simulation educators and technicians: evaluation strategy and outcomes
Main authors:
Format: Online Article (Text)
Language: English
Published: BioMed Central, 2016
Subjects:
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4722779/
https://www.ncbi.nlm.nih.gov/pubmed/26796786
http://dx.doi.org/10.1186/s12909-016-0548-x

Summary:

BACKGROUND: Simulation-based education (SBE) has seen a dramatic uptake in health professions education over the last decade. SBE offers learning opportunities that are difficult to access by other methods. Competent faculty are seen as key to high-quality SBE. In 2011, in response to a significant national healthcare issue – the need to enhance the quality and scale of SBE – a group of Australian universities was commissioned to develop a national training program, the Australian Simulation Educator and Technician Training (AusSETT) Program. This paper reports the evaluation of this large-scale initiative.

METHODS: The AusSETT Program adopted a train-the-trainer model, offering up to three days of workshops and between four and eight hours of e-learning. The Program was offered across all professions in all states and territories. Three hundred and three participants attended workshops, with 230 also completing e-learning modules. Topics included foundational learning theory, orientation to diverse simulation modalities, briefing, and debriefing. A layered, objectives-oriented evaluation strategy was adopted, drawing on multiple stakeholders (participants, external experts), multiple methods of data collection (end-of-module evaluations, workshop observer reports and individual interviews) and multiple data points (immediately and two months later). Descriptive statistics were used to analyse numerical data, while textual data (written comments and transcripts of interviews) underwent content or thematic analysis.

RESULTS: For each module, between 45 and 254 participants completed evaluations. The content and educational methods were rated highly, with items exceeding the pre-established standard. In written evaluations, participants identified strengths of the Program (e.g. high-quality facilitation, breadth and depth of content) and areas for development (e.g. electronic portfolio, learning management system). Interviews with participants suggested the Program had positively impacted their educational practices. Observers reported a high-quality educational experience for participants, with content and methods aligned to perceived participant needs.

CONCLUSIONS: The AusSETT Program is a significant and enduring learning resource. The development of a national training program to support a competent simulation workforce is feasible, and the Program objectives were largely met. Although the study design has limitations (e.g. self-report), it also has strengths, such as exploring impact two months later. The evaluation of the Program informs the next phase of the national strategy for simulation educators and technicians with respect to content and processes, strengths and areas for development.

ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s12909-016-0548-x) contains supplementary material, which is available to authorized users.
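The methods describe summarising end-of-module ratings with descriptive statistics and judging items against a pre-established standard. Purely as an illustration of that kind of analysis – the module names, rating scale, ratings, and threshold below are hypothetical and are not taken from the AusSETT evaluation – a minimal Python sketch might look like this:

```python
# Illustrative sketch only: hypothetical end-of-module ratings on a 1-5 scale,
# summarised with descriptive statistics and checked against an assumed
# pre-established standard (mean rating >= 4.0). None of the numbers below
# come from the AusSETT evaluation.
from statistics import mean, median, stdev

PRE_ESTABLISHED_STANDARD = 4.0  # assumed threshold, not reported in the paper

# Hypothetical ratings (1 = poor, 5 = excellent) for three example modules
module_ratings = {
    "Learning theory": [5, 4, 4, 5, 3, 4, 5],
    "Simulation modalities": [4, 4, 5, 4, 4, 3, 5],
    "Briefing and debriefing": [5, 5, 4, 4, 5, 4, 4],
}

for module, ratings in module_ratings.items():
    m = mean(ratings)
    print(
        f"{module}: n={len(ratings)}, mean={m:.2f}, "
        f"median={median(ratings)}, sd={stdev(ratings):.2f}, "
        f"meets standard={m >= PRE_ESTABLISHED_STANDARD}"
    )
```

In practice the qualitative strands (written comments, interview transcripts) would be handled separately through content or thematic analysis, as the abstract notes; the sketch covers only the numerical summary step.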