Assessment of a Simulated Case-Based Measurement of Physician Diagnostic Performance

Bibliographic Details

Main Authors: Chatterjee, Souvik; Desai, Sanjay; Manesh, Reza; Sun, Junfeng; Nundy, Shantanu; Wright, Scott M.
Format: Online Article Text
Language: English
Published: American Medical Association, 2019
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6484555/
https://www.ncbi.nlm.nih.gov/pubmed/30646211
http://dx.doi.org/10.1001/jamanetworkopen.2018.7006
Description:

IMPORTANCE: Diagnostic acumen is a fundamental skill in the practice of medicine. Scalable, practical, and objective tools to assess diagnostic performance are lacking.

OBJECTIVE: To validate a new method of assessing diagnostic performance that uses automated techniques to assess physicians’ diagnostic performance on brief, open-ended case simulations.

DESIGN, SETTING, AND PARTICIPANTS: Retrospective cohort study of 11 023 unique attempts to solve case simulations on an online software platform, The Human Diagnosis Project (Human Dx). A total of 1738 practicing physicians, residents (internal medicine, family medicine, and emergency medicine), and medical students throughout the United States voluntarily used Human Dx software between January 21, 2016, and January 15, 2017.

MAIN OUTCOMES AND MEASURES: Internal structure validity was assessed by 3 measures of diagnostic performance: accuracy, efficiency, and a combined score (Diagnostic Acumen Precision Performance [DAPP]). These were each analyzed by level of training. Association with other variables’ validity evidence was evaluated by correlating diagnostic performance and affiliation with an institution ranked in the top 25 medical schools by US News and World Report.

RESULTS: Data were analyzed for 239 attending physicians, 926 resident physicians, 347 intern physicians, and 226 medical students. Attending physicians had higher mean accuracy scores than medical students (difference, 8.1; 95% CI, 4.2-12.0; P < .001), as did residents (difference, 8.0; 95% CI, 4.8-11.2; P < .001) and interns (difference, 5.9; 95% CI, 2.3-9.6; P < .001). Attending physicians had higher mean efficiency compared with residents (difference, 4.8; 95% CI, 1.8-7.8; P < .001), interns (difference, 5.0; 95% CI, 1.5-8.4; P = .001), and medical students (difference, 5.4; 95% CI, 1.4-9.3; P = .003). Attending physicians also had significantly higher mean DAPP scores than residents (difference, 2.6; 95% CI, 0.0-5.2; P = .05), interns (difference, 3.6; 95% CI, 0.6-6.6; P = .01), and medical students (difference, 6.7; 95% CI, 3.3-10.2; P < .001). Attending physicians affiliated with a US News and World Report–ranked institution had higher mean DAPP scores compared with nonaffiliated attending physicians (80 [95% CI, 77-83] vs 72 [95% CI, 70-74], respectively; P < .001). Resident physicians affiliated with an institution ranked in the top 25 medical schools by US News and World Report also had higher mean DAPP scores compared with nonaffiliated peers (75 [95% CI, 73-77] vs 71 [95% CI, 69-72], respectively; P < .001).

CONCLUSIONS AND RELEVANCE: The data suggest that diagnostic performance is higher in those with more training and that DAPP scores may be a valid measure to appraise diagnostic performance. This diagnostic assessment tool allows individuals to receive immediate feedback on performance through an openly accessible online platform.
Collection: PubMed (National Center for Biotechnology Information); record pubmed-6484555; record format MEDLINE/PubMed.
Published in: JAMA Netw Open (Original Investigation), 2019-01-11.
Copyright 2019 Chatterjee S et al. JAMA Network Open. This is an open access article distributed under the terms of the CC-BY License (http://creativecommons.org/licenses/by/4.0/).