A national training program for simulation educators and technicians: evaluation strategy and outcomes
Main Authors: Nestel, Debra; Bearman, Margaret; Brooks, Peter; Campher, Dylan; Freeman, Kirsty; Greenhill, Jennene; Jolly, Brian; Rogers, Leanne; Rudd, Cobie; Sprick, Cyle; Sutton, Beverley; Harlim, Jennifer; Watson, Marcus
Format: Online Article Text
Language: English
Published: BioMed Central, 2016
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4722779/
https://www.ncbi.nlm.nih.gov/pubmed/26796786
http://dx.doi.org/10.1186/s12909-016-0548-x
author | Nestel, Debra; Bearman, Margaret; Brooks, Peter; Campher, Dylan; Freeman, Kirsty; Greenhill, Jennene; Jolly, Brian; Rogers, Leanne; Rudd, Cobie; Sprick, Cyle; Sutton, Beverley; Harlim, Jennifer; Watson, Marcus
author_sort | Nestel, Debra |
collection | PubMed |
description | BACKGROUND: Simulation-based education (SBE) has seen a dramatic uptake in health professions education over the last decade. SBE offers learning opportunities that are difficult to access by other methods. Competent faculty is seen as key to high quality SBE. In 2011, in response to a significant national healthcare issue – the need to enhance the quality and scale of SBE - a group of Australian universities was commissioned to develop a national training program - Australian Simulation Educator and Technician Training (AusSETT) Program. This paper reports the evaluation of this large-scale initiative. METHODS: The AusSETT Program adopted a train-the-trainer model, which offered up to three days of workshops and between four and eight hours of e-learning. The Program was offered across all professions in all states and territories. Three hundred and three participants attended workshops with 230 also completing e-learning modules. Topics included: foundational learning theory; orientation to diverse simulation modalities; briefing; and debriefing. A layered objectives-oriented evaluation strategy was adopted with multiple stakeholders (participants, external experts), methods of data collection (end of module evaluations, workshop observer reports and individual interviews) and at multiple data points (immediate and two months later). Descriptive statistics were used to analyse numerical data while textual data (written comments and transcripts of interviews) underwent content or thematic analysis. RESULTS: For each module, between 45 and 254 participants completed evaluations. The content and educational methods were rated highly with items exceeding the pre-established standard. In written evaluations, participants identified strengths (e.g. high quality facilitation, breadth and depth of content) and areas for development (e.g. electronic portfolio, learning management system) of the Program. Interviews with participants suggested the Program had positively impacted their educational practices. Observers reported a high quality educational experience for participants with alignment of content and methods with perceived participant needs. CONCLUSIONS: The AusSETT Program is a significant and enduring learning resource. The development of a national training program to support a competent simulation workforce is feasible. The Program objectives were largely met. Although there are limitations with the study design (e.g. self-report), there are strengths such as exploring the impact two months later. The evaluation of the Program informs the next phase of the national strategy for simulation educators and technicians with respect to content and processes, strengths and areas for development. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s12909-016-0548-x) contains supplementary material, which is available to authorized users. |
format | Online Article Text |
id | pubmed-4722779 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2016 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-4722779 2016-01-23 A national training program for simulation educators and technicians: evaluation strategy and outcomes. Nestel, Debra, et al. BMC Med Educ, Research Article. BioMed Central 2016-01-22 /pmc/articles/PMC4722779/ /pubmed/26796786 http://dx.doi.org/10.1186/s12909-016-0548-x Text en © Nestel et al.
2016. Open Access: This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated. |
title | A national training program for simulation educators and technicians: evaluation strategy and outcomes |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4722779/ https://www.ncbi.nlm.nih.gov/pubmed/26796786 http://dx.doi.org/10.1186/s12909-016-0548-x |