
Assessment of Emergency Medicine Residents’ Clinical Reasoning: Validation of a Script Concordance Test

INTRODUCTION: A primary aim of residency training is to develop competence in clinical reasoning. However, there are few instruments that can accurately, reliably, and efficiently assess residents’ clinical decision-making ability. This study aimed to externally validate the script concordance test in emergency medicine (SCT-EM), an assessment tool designed for this purpose.

METHODS: Using established methodology for the SCT-EM, we compared EM residents’ performance on the SCT-EM to that of an expert panel of emergency physicians at three urban academic centers. We performed adjusted pairwise t-tests to compare differences between all residents and attending physicians, as well as among resident postgraduate year (PGY) levels. We tested the correlation between SCT-EM and Accreditation Council for Graduate Medical Education Milestone scores using Pearson’s correlation coefficient. Inter-item covariances for SCT items were assessed using Cronbach’s alpha.

RESULTS: The SCT-EM was administered to 68 residents and 13 attendings. There was a significant difference in mean scores among the groups (mean ± standard deviation: PGY-1, 59 ± 7; PGY-2, 62 ± 6; PGY-3, 60 ± 8; PGY-4, 61 ± 8; attendings, 73 ± 8; p < 0.01). Post hoc pairwise comparisons demonstrated that significant differences in mean scores occurred only between each PGY level and the attendings (p < 0.01 for PGY-1 through PGY-4 vs the attending group). Performance on the SCT-EM and EM Milestones was not significantly correlated (r = 0.12, p = 0.35). Internal reliability, assessed with Cronbach’s alpha, was 0.67 for all examinees and 0.89 in the expert-only group.

CONCLUSION: The SCT-EM has limited utility in reliably assessing clinical reasoning among EM residents. Although the SCT-EM differentiated clinical reasoning ability between residents and expert faculty, it neither differentiated between PGY levels nor correlated with Milestone scores. Furthermore, several limitations threaten the validity of the SCT-EM, suggesting that further study is needed in more diverse settings.

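For readers unfamiliar with the three statistics named in METHODS, the following is a minimal Python sketch, not the authors’ analysis code: all score arrays are hypothetical placeholders, the `cronbach_alpha` helper is our own illustration of the standard formula, and a Bonferroni correction is only one plausible reading of “adjusted pairwise t-tests.” It assumes numpy and scipy are available.

```python
# Illustrative sketch only: hypothetical scores, not the study's data or code.
import numpy as np
from itertools import combinations
from scipy import stats

# Hypothetical SCT-EM percent scores per group (placeholder values).
groups = {
    "PGY-1": np.array([59.0, 55.0, 63.0, 60.0]),
    "PGY-2": np.array([62.0, 60.0, 65.0, 61.0]),
    "PGY-3": np.array([60.0, 52.0, 66.0, 62.0]),
    "PGY-4": np.array([61.0, 58.0, 67.0, 59.0]),
    "Attending": np.array([73.0, 70.0, 78.0, 71.0]),
}

# Pairwise t-tests with a Bonferroni adjustment -- one plausible reading of
# "adjusted pairwise t-tests"; the paper may have used another correction.
pairs = list(combinations(groups, 2))
for a, b in pairs:
    t, p = stats.ttest_ind(groups[a], groups[b])
    p_adj = min(p * len(pairs), 1.0)  # Bonferroni: scale p by number of tests
    print(f"{a} vs {b}: t = {t:.2f}, adjusted p = {p_adj:.3f}")

# Pearson correlation between SCT-EM and Milestone scores (hypothetical pairs).
sct_scores = np.array([59.0, 62.0, 60.0, 61.0, 64.0, 58.0])
milestones = np.array([3.0, 3.5, 3.2, 3.8, 3.4, 3.1])
r, p = stats.pearsonr(sct_scores, milestones)
print(f"SCT-EM vs Milestones: r = {r:.2f}, p = {p:.2f}")

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = item_scores.shape[1]
    item_var_sum = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical item-level matrix: 6 examinees x 10 SCT items.
rng = np.random.default_rng(0)
print(f"alpha = {cronbach_alpha(rng.random((6, 10))):.2f}")
```

The three outputs correspond to the quantities reported in RESULTS: the group-by-group score comparisons, the r = 0.12 SCT-EM/Milestones correlation, and the 0.67 (all examinees) and 0.89 (experts only) alpha values.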

Bibliographic Details

Main Authors: Steinberg, Eric; Cowan, Ethan; Lin, Michelle P.; Sielicki, Anthony; Warrington, Steven
Format: Online Article Text
Language: English
Published: Department of Emergency Medicine, University of California, Irvine School of Medicine, 2020
Subjects: Education
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7390545/
https://www.ncbi.nlm.nih.gov/pubmed/32726273
http://dx.doi.org/10.5811/westjem.2020.3.46035
collection PubMed
id pubmed-7390545
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-7390545 2020-07-31 West J Emerg Med (Education) 2020-07; published online 2020-06-24 /pmc/articles/PMC7390545/ /pubmed/32726273 http://dx.doi.org/10.5811/westjem.2020.3.46035 Text en Copyright © 2020 Steinberg et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/