
Concordance Between Electronic Clinical Documentation and Physicians’ Observed Behavior


Bibliographic Details
Main Authors: Berdahl, Carl T., Moran, Gregory J., McBride, Owen, Santini, Alexandra M., Verzhbinsky, Ilya A., Schriger, David L.
Format: Online Article Text
Language: English
Published: American Medical Association 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6751766/
https://www.ncbi.nlm.nih.gov/pubmed/31532513
http://dx.doi.org/10.1001/jamanetworkopen.2019.11390
_version_ 1783452679246184448
author Berdahl, Carl T.
Moran, Gregory J.
McBride, Owen
Santini, Alexandra M.
Verzhbinsky, Ilya A.
Schriger, David L.
author_facet Berdahl, Carl T.
Moran, Gregory J.
McBride, Owen
Santini, Alexandra M.
Verzhbinsky, Ilya A.
Schriger, David L.
author_sort Berdahl, Carl T.
collection PubMed
description IMPORTANCE: Following the adoption of electronic health records into a regulatory environment designed for paper records, there has been little investigation into the accuracy of physician documentation. OBJECTIVE: To quantify the percentage of emergency physician documentation of the review of systems (ROS) and physical examination (PE) that observers can confirm. DESIGN, SETTING, AND PARTICIPANTS: This case series took place at emergency departments in 2 academic medical centers between 2016 and 2018. Participants’ patient encounters were observed to compare real-time performance with clinical documentation. EXPOSURES: Resident physicians were shadowed by trained observers for 20 encounters (10 encounters per physician per site) to obtain real-time observational data; associated electronic health record data were subsequently reviewed. MAIN OUTCOMES AND MEASURES: Number of confirmed ROS systems (range, 0-14) divided by the number of documented ROS systems (range, 0-14), and number of confirmed PE systems (range, 0-14) divided by the number of documented PE systems (range, 0-14). RESULTS: The final study cohort included 9 licensed emergency medicine residents who evaluated a total of 180 patients (mean [SD] age, 48.7 [20.0] years; 91 [50.5%] women). For ROS, physicians documented a median (interquartile range [IQR]) of 14 (8-14) systems, while audio recordings confirmed a median (IQR) of 5 (3-6) systems. Overall, 755 of 1961 documented ROS systems (38.5%) were confirmed by audio recording data. For PE, resident physicians documented a median (IQR) of 8 (7-9) verifiable systems, while observers confirmed a median (IQR) of 5.5 (3-6) systems. Overall, 760 of 1429 verifiable documented PE systems (53.2%) were confirmed by concurrent observation. Interrater reliability for rating of ROS and PE was more than 90% for all measures. CONCLUSIONS AND RELEVANCE: In this study of 9 licensed emergency medicine residents, there were inconsistencies between the documentation of ROS and PE findings in the electronic health record and observational reports. These findings raise the possibility that some documentation may not accurately represent physician actions. Further studies should be undertaken to determine whether this occurrence is widespread. However, because such studies are unlikely to be performed owing to institution-level barriers that exist nationwide, payers should consider removing financial incentives to generate lengthy documentation.
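Note on the outcome measure described above: it is a simple concordance proportion, the number of confirmed systems divided by the number of documented (verifiable) systems. The short Python sketch below reproduces the aggregate percentages from the counts reported in the abstract; the function name, variable names, and one-decimal rounding are illustrative assumptions and not code from the study.

# Illustrative sketch only: recomputes the aggregate concordance proportions
# reported in the abstract (confirmed systems / documented systems).
# Counts are taken from the RESULTS text; names are assumptions for readability.

def concordance_pct(confirmed: int, documented: int) -> float:
    """Percentage of documented systems that observers could confirm."""
    return 100.0 * confirmed / documented

ros_pct = concordance_pct(confirmed=755, documented=1961)  # review of systems
pe_pct = concordance_pct(confirmed=760, documented=1429)   # physical examination

print(f"ROS concordance: {ros_pct:.1f}%")  # -> 38.5%
print(f"PE concordance: {pe_pct:.1f}%")    # -> 53.2%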
format Online
Article
Text
id pubmed-6751766
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher American Medical Association
record_format MEDLINE/PubMed
spelling pubmed-6751766 2019-10-04 Concordance Between Electronic Clinical Documentation and Physicians’ Observed Behavior Berdahl, Carl T. Moran, Gregory J. McBride, Owen Santini, Alexandra M. Verzhbinsky, Ilya A. Schriger, David L. JAMA Netw Open Original Investigation IMPORTANCE: Following the adoption of electronic health records into a regulatory environment designed for paper records, there has been little investigation into the accuracy of physician documentation. OBJECTIVE: To quantify the percentage of emergency physician documentation of the review of systems (ROS) and physical examination (PE) that observers can confirm. DESIGN, SETTING, AND PARTICIPANTS: This case series took place at emergency departments in 2 academic medical centers between 2016 and 2018. Participants’ patient encounters were observed to compare real-time performance with clinical documentation. EXPOSURES: Resident physicians were shadowed by trained observers for 20 encounters (10 encounters per physician per site) to obtain real-time observational data; associated electronic health record data were subsequently reviewed. MAIN OUTCOMES AND MEASURES: Number of confirmed ROS systems (range, 0-14) divided by the number of documented ROS systems (range, 0-14), and number of confirmed PE systems (range, 0-14) divided by the number of documented PE systems (range, 0-14). RESULTS: The final study cohort included 9 licensed emergency medicine residents who evaluated a total of 180 patients (mean [SD] age, 48.7 [20.0] years; 91 [50.5%] women). For ROS, physicians documented a median (interquartile range [IQR]) of 14 (8-14) systems, while audio recordings confirmed a median (IQR) of 5 (3-6) systems. Overall, 755 of 1961 documented ROS systems (38.5%) were confirmed by audio recording data. For PE, resident physicians documented a median (IQR) of 8 (7-9) verifiable systems, while observers confirmed a median (IQR) of 5.5 (3-6) systems. Overall, 760 of 1429 verifiable documented PE systems (53.2%) were confirmed by concurrent observation. Interrater reliability for rating of ROS and PE was more than 90% for all measures. CONCLUSIONS AND RELEVANCE: In this study of 9 licensed emergency medicine residents, there were inconsistencies between the documentation of ROS and PE findings in the electronic health record and observational reports. These findings raise the possibility that some documentation may not accurately represent physician actions. Further studies should be undertaken to determine whether this occurrence is widespread. However, because such studies are unlikely to be performed owing to institution-level barriers that exist nationwide, payers should consider removing financial incentives to generate lengthy documentation. American Medical Association 2019-09-18 /pmc/articles/PMC6751766/ /pubmed/31532513 http://dx.doi.org/10.1001/jamanetworkopen.2019.11390 Text en Copyright 2019 Berdahl CT et al. JAMA Network Open. http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the CC-BY License.
spellingShingle Original Investigation
Berdahl, Carl T.
Moran, Gregory J.
McBride, Owen
Santini, Alexandra M.
Verzhbinsky, Ilya A.
Schriger, David L.
Concordance Between Electronic Clinical Documentation and Physicians’ Observed Behavior
title Concordance Between Electronic Clinical Documentation and Physicians’ Observed Behavior
title_full Concordance Between Electronic Clinical Documentation and Physicians’ Observed Behavior
title_fullStr Concordance Between Electronic Clinical Documentation and Physicians’ Observed Behavior
title_full_unstemmed Concordance Between Electronic Clinical Documentation and Physicians’ Observed Behavior
title_short Concordance Between Electronic Clinical Documentation and Physicians’ Observed Behavior
title_sort concordance between electronic clinical documentation and physicians’ observed behavior
topic Original Investigation
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6751766/
https://www.ncbi.nlm.nih.gov/pubmed/31532513
http://dx.doi.org/10.1001/jamanetworkopen.2019.11390
work_keys_str_mv AT berdahlcarlt concordancebetweenelectronicclinicaldocumentationandphysiciansobservedbehavior
AT morangregoryj concordancebetweenelectronicclinicaldocumentationandphysiciansobservedbehavior
AT mcbrideowen concordancebetweenelectronicclinicaldocumentationandphysiciansobservedbehavior
AT santinialexandram concordancebetweenelectronicclinicaldocumentationandphysiciansobservedbehavior
AT verzhbinskyilyaa concordancebetweenelectronicclinicaldocumentationandphysiciansobservedbehavior
AT schrigerdavidl concordancebetweenelectronicclinicaldocumentationandphysiciansobservedbehavior