Usability Testing of a Complex Clinical Decision Support Tool in the Emergency Department: Lessons Learned


Bibliographic Details
Main Authors: Press, Anne; McCullagh, Lauren; Khan, Sundas; Schachter, Andy; Pardo, Salvatore; McGinn, Thomas
Format: Online Article Text
Language: English
Published: Gunther Eysenbach, 2015
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4797671/
https://www.ncbi.nlm.nih.gov/pubmed/27025540
http://dx.doi.org/10.2196/humanfactors.4537
_version_ 1782422000578330624
author Press, Anne
McCullagh, Lauren
Khan, Sundas
Schachter, Andy
Pardo, Salvatore
McGinn, Thomas
author_facet Press, Anne
McCullagh, Lauren
Khan, Sundas
Schachter, Andy
Pardo, Salvatore
McGinn, Thomas
author_sort Press, Anne
collection PubMed
description BACKGROUND: As the electronic health record (EHR) becomes the preferred documentation tool across medical practices, health care organizations are pushing for clinical decision support systems (CDSS) to bring clinical decision support (CDS) tools to the forefront of patient-physician interactions. A CDSS is integrated into the EHR and allows physicians to use CDS tools easily. However, CDSS are often integrated into the EHR without an initial phase of usability testing, resulting in poor adoption rates. Usability testing is important because it evaluates a CDSS with its actual users. This paper outlines the usability phase of a study that will test the impact of integrating the Wells CDSS for pulmonary embolism (PE) diagnosis into a large urban emergency department, where workflow is often chaotic and high-stakes decisions are frequently made. We hypothesize that conducting usability testing before integrating the Wells score into an emergency room EHR will result in increased adoption rates by physicians.
OBJECTIVE: The objective of the study was to conduct usability testing for the integration of the Wells clinical prediction rule into a tertiary care center’s emergency department EHR.
METHODS: We conducted usability testing of a CDS tool in the emergency department EHR. The CDS tool consisted of the Wells rule for PE in the form of a calculator and was triggered by computed tomography (CT) orders or patients’ chief complaints. The study was conducted at a tertiary hospital in Queens, New York. Seven residents were recruited and participated in two phases of usability testing. The usability testing employed a “think-aloud” method and “near-live” clinical simulation, in which care providers interacted with standardized patients enacting a clinical scenario. Both phases were audiotaped, videotaped, and recorded with screen-capture software.
RESULTS: Phase I: Data from the “think-aloud” phase of the study showed an overall positive outlook on the Wells tool for assessing a patient for a PE diagnosis. Subjects described the tool as “well-organized” and “better than clinical judgment.” Changes were made to improve the tool’s placement in the EHR to make it optimal for decision making, auto-populate boxes, and minimize click fatigue. Phase II: After the changes from Phase I were incorporated, the participants noted improvements: there was less toggling between screens, they had all the clinical information required to complete the tool, and they were able to complete the patient visit efficiently. However, an optimal location for triggering the tool remained controversial.
CONCLUSIONS: This study successfully combined “think-aloud” protocol analysis with “near-live” clinical simulations in a usability evaluation of a CDS tool that will be implemented in the emergency room environment. Both methods proved useful in assessing the CDS tool and allowed us to refine its usability and workflow.
format Online
Article
Text
id pubmed-4797671
institution National Center for Biotechnology Information
language English
publishDate 2015
publisher Gunther Eysenbach
record_format MEDLINE/PubMed
spelling pubmed-4797671 2016-03-23 Usability Testing of a Complex Clinical Decision Support Tool in the Emergency Department: Lessons Learned Press, Anne; McCullagh, Lauren; Khan, Sundas; Schachter, Andy; Pardo, Salvatore; McGinn, Thomas JMIR Hum Factors Original Paper Gunther Eysenbach 2015-09-10 /pmc/articles/PMC4797671/ /pubmed/27025540 http://dx.doi.org/10.2196/humanfactors.4537 Text en ©Anne Press, Lauren McCullagh, Sundas Khan, Andy Schachter, Salvatore Pardo, Thomas McGinn. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 10.09.2015.
http://creativecommons.org/licenses/by/2.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on http://humanfactors.jmir.org, as well as this copyright and license information must be included.
spellingShingle Original Paper
Press, Anne
McCullagh, Lauren
Khan, Sundas
Schachter, Andy
Pardo, Salvatore
McGinn, Thomas
Usability Testing of a Complex Clinical Decision Support Tool in the Emergency Department: Lessons Learned
title Usability Testing of a Complex Clinical Decision Support Tool in the Emergency Department: Lessons Learned
title_full Usability Testing of a Complex Clinical Decision Support Tool in the Emergency Department: Lessons Learned
title_fullStr Usability Testing of a Complex Clinical Decision Support Tool in the Emergency Department: Lessons Learned
title_full_unstemmed Usability Testing of a Complex Clinical Decision Support Tool in the Emergency Department: Lessons Learned
title_short Usability Testing of a Complex Clinical Decision Support Tool in the Emergency Department: Lessons Learned
title_sort usability testing of a complex clinical decision support tool in the emergency department: lessons learned
topic Original Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4797671/
https://www.ncbi.nlm.nih.gov/pubmed/27025540
http://dx.doi.org/10.2196/humanfactors.4537
work_keys_str_mv AT pressanne usabilitytestingofacomplexclinicaldecisionsupporttoolintheemergencydepartmentlessonslearned
AT mccullaghlauren usabilitytestingofacomplexclinicaldecisionsupporttoolintheemergencydepartmentlessonslearned
AT khansundas usabilitytestingofacomplexclinicaldecisionsupporttoolintheemergencydepartmentlessonslearned
AT schachterandy usabilitytestingofacomplexclinicaldecisionsupporttoolintheemergencydepartmentlessonslearned
AT pardosalvatore usabilitytestingofacomplexclinicaldecisionsupporttoolintheemergencydepartmentlessonslearned
AT mcginnthomas usabilitytestingofacomplexclinicaldecisionsupporttoolintheemergencydepartmentlessonslearned