How Does Learnability of Primary Care Resident Physicians Increase After Seven Months of Using an Electronic Health Record? A Longitudinal Study
BACKGROUND: Electronic health records (EHRs) with poor usability present steep learning curves for new resident physicians, who are already overwhelmed in learning a new specialty. This may lead to error-prone use of EHRs in medical practice by new resident physicians. OBJECTIVE: The study goal was...
| Main authors: | Clarke, Martina A; Belden, Jeffery L; Kim, Min Soon |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Gunther Eysenbach, 2016 |
| Subjects: | Original Paper |
| Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4811662/ https://www.ncbi.nlm.nih.gov/pubmed/27025237 http://dx.doi.org/10.2196/humanfactors.4601 |
_version_ | 1782424007207813120 |
author | Clarke, Martina A Belden, Jeffery L Kim, Min Soon |
author_facet | Clarke, Martina A Belden, Jeffery L Kim, Min Soon |
author_sort | Clarke, Martina A |
collection | PubMed |
description | BACKGROUND: Electronic health records (EHRs) with poor usability present steep learning curves for new resident physicians, who are already overwhelmed in learning a new specialty. This may lead to error-prone use of EHRs in medical practice by new resident physicians. OBJECTIVE: The study goal was to determine learnability gaps between expert and novice primary care resident physician groups by comparing performance measures when using EHRs. METHODS: We compared performance measures after two rounds of learnability tests (November 12, 2013 to December 19, 2013; February 12, 2014 to April 22, 2014). In Rounds 1 and 2, 10 novice and 6 expert physicians, and 8 novice and 4 expert physicians participated, respectively. Laboratory-based learnability tests using video analyses were conducted to analyze learnability gaps between novice and expert physicians. Physicians completed 19 tasks, using a think-aloud strategy, based on an artificial but typical patient visit note. We used quantitative performance measures (percent task success, time-on-task, mouse activities), the System Usability Scale (SUS), and qualitative narrative feedback during the participant debriefing session. 
RESULTS: There was a 6-percentage-point increase in novice physicians’ task success rate (Round 1: 92%, 95% CI 87-99; Round 2: 98%, 95% CI 95-100) and a 7-percentage-point increase in expert physicians’ task success rate (Round 1: 90%, 95% CI 83-97; Round 2: 97%, 95% CI 93-100); a 10% decrease in novice physicians’ time-on-task (Round 1: 44s, 95% CI 32-62; Round 2: 40s, 95% CI 27-59) and 21% decrease in expert physicians’ time-on-task (Round 1: 39s, 95% CI 29-51; Round 2: 31s, 95% CI 22-42); a 20% decrease in novice physicians’ mouse clicks (Round 1: 8 clicks, 95% CI 6-13; Round 2: 7 clicks, 95% CI 4-12) and 39% decrease in expert physicians’ mouse clicks (Round 1: 8 clicks, 95% CI 5-11; Round 2: 3 clicks, 95% CI 1-10); a 14% decrease in novice physicians’ mouse movements (Round 1: 9247 pixels, 95% CI 6404-13,353; Round 2: 7991 pixels, 95% CI 5350-11,936) and 14% decrease in expert physicians’ mouse movements (Round 1: 7325 pixels, 95% CI 5237-10,247; Round 2: 6329 pixels, 95% CI 4299-9317). The SUS measure of overall usability demonstrated only minimal change in the novice group (Round 1: 69, high marginal; Round 2: 68, high marginal) and no change in the expert group (74; high marginal for both rounds). CONCLUSIONS: This study found differences in novice and expert physicians’ performance, demonstrating that physicians’ proficiency increased with EHR experience. Our study may serve as a guideline to improve current EHR training programs. Future directions include identifying usability issues faced by physicians when using EHRs, through a more granular task analysis to recognize subtle usability issues that would otherwise be overlooked. |
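The relative changes reported in the RESULTS field can be sanity-checked against the Round 1 and Round 2 point estimates. As an illustrative sketch (not part of the original record), recomputing the percent decreases from the rounded values reproduces several of the reported figures:

```python
def pct_decrease(round1, round2):
    """Percent decrease from Round 1 to Round 2, rounded to a whole percent."""
    return round((round1 - round2) / round1 * 100)

# Expert time-on-task: 39 s -> 31 s (abstract reports a 21% decrease)
print(pct_decrease(39, 31))      # 21
# Novice mouse movement: 9247 px -> 7991 px (reported as 14%)
print(pct_decrease(9247, 7991))  # 14
# Expert mouse movement: 7325 px -> 6329 px (reported as 14%)
print(pct_decrease(7325, 6329))  # 14
```

Not every figure matches exactly from the rounded values (e.g., novice time-on-task, 44 s to 40 s, rounds to a 9% decrease versus the reported 10%), which suggests the authors computed changes from unrounded means.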
format | Online Article Text |
id | pubmed-4811662 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2016 |
publisher | Gunther Eysenbach |
record_format | MEDLINE/PubMed |
spelling | pubmed-48116622016-04-15 How Does Learnability of Primary Care Resident Physicians Increase After Seven Months of Using an Electronic Health Record? A Longitudinal Study Clarke, Martina A Belden, Jeffery L Kim, Min Soon JMIR Hum Factors Original Paper BACKGROUND: Electronic health records (EHRs) with poor usability present steep learning curves for new resident physicians, who are already overwhelmed in learning a new specialty. This may lead to error-prone use of EHRs in medical practice by new resident physicians. OBJECTIVE: The study goal was to determine learnability gaps between expert and novice primary care resident physician groups by comparing performance measures when using EHRs. METHODS: We compared performance measures after two rounds of learnability tests (November 12, 2013 to December 19, 2013; February 12, 2014 to April 22, 2014). In Rounds 1 and 2, 10 novice and 6 expert physicians, and 8 novice and 4 expert physicians participated, respectively. Laboratory-based learnability tests using video analyses were conducted to analyze learnability gaps between novice and expert physicians. Physicians completed 19 tasks, using a think-aloud strategy, based on an artificial but typical patient visit note. We used quantitative performance measures (percent task success, time-on-task, mouse activities), the System Usability Scale (SUS), and qualitative narrative feedback during the participant debriefing session. 
RESULTS: There was a 6-percentage-point increase in novice physicians’ task success rate (Round 1: 92%, 95% CI 87-99; Round 2: 98%, 95% CI 95-100) and a 7-percentage-point increase in expert physicians’ task success rate (Round 1: 90%, 95% CI 83-97; Round 2: 97%, 95% CI 93-100); a 10% decrease in novice physicians’ time-on-task (Round 1: 44s, 95% CI 32-62; Round 2: 40s, 95% CI 27-59) and 21% decrease in expert physicians’ time-on-task (Round 1: 39s, 95% CI 29-51; Round 2: 31s, 95% CI 22-42); a 20% decrease in novice physicians’ mouse clicks (Round 1: 8 clicks, 95% CI 6-13; Round 2: 7 clicks, 95% CI 4-12) and 39% decrease in expert physicians’ mouse clicks (Round 1: 8 clicks, 95% CI 5-11; Round 2: 3 clicks, 95% CI 1-10); a 14% decrease in novice physicians’ mouse movements (Round 1: 9247 pixels, 95% CI 6404-13,353; Round 2: 7991 pixels, 95% CI 5350-11,936) and 14% decrease in expert physicians’ mouse movements (Round 1: 7325 pixels, 95% CI 5237-10,247; Round 2: 6329 pixels, 95% CI 4299-9317). The SUS measure of overall usability demonstrated only minimal change in the novice group (Round 1: 69, high marginal; Round 2: 68, high marginal) and no change in the expert group (74; high marginal for both rounds). CONCLUSIONS: This study found differences in novice and expert physicians’ performance, demonstrating that physicians’ proficiency increased with EHR experience. Our study may serve as a guideline to improve current EHR training programs. Future directions include identifying usability issues faced by physicians when using EHRs, through a more granular task analysis to recognize subtle usability issues that would otherwise be overlooked. Gunther Eysenbach 2016-02-15 /pmc/articles/PMC4811662/ /pubmed/27025237 http://dx.doi.org/10.2196/humanfactors.4601 Text en ©Martina A Clarke, Jeffery L Belden, Min Soon Kim. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 15.02.2016. 
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on http://humanfactors.jmir.org, as well as this copyright and license information must be included. |
spellingShingle | Original Paper Clarke, Martina A Belden, Jeffery L Kim, Min Soon How Does Learnability of Primary Care Resident Physicians Increase After Seven Months of Using an Electronic Health Record? A Longitudinal Study |
title | How Does Learnability of Primary Care Resident Physicians Increase After Seven Months of Using an Electronic Health Record? A Longitudinal Study |
title_full | How Does Learnability of Primary Care Resident Physicians Increase After Seven Months of Using an Electronic Health Record? A Longitudinal Study |
title_fullStr | How Does Learnability of Primary Care Resident Physicians Increase After Seven Months of Using an Electronic Health Record? A Longitudinal Study |
title_full_unstemmed | How Does Learnability of Primary Care Resident Physicians Increase After Seven Months of Using an Electronic Health Record? A Longitudinal Study |
title_short | How Does Learnability of Primary Care Resident Physicians Increase After Seven Months of Using an Electronic Health Record? A Longitudinal Study |
title_sort | how does learnability of primary care resident physicians increase after seven months of using an electronic health record? a longitudinal study |
topic | Original Paper |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4811662/ https://www.ncbi.nlm.nih.gov/pubmed/27025237 http://dx.doi.org/10.2196/humanfactors.4601 |
work_keys_str_mv | AT clarkemartinaa howdoeslearnabilityofprimarycareresidentphysiciansincreaseaftersevenmonthsofusinganelectronichealthrecordalongitudinalstudy AT beldenjefferyl howdoeslearnabilityofprimarycareresidentphysiciansincreaseaftersevenmonthsofusinganelectronichealthrecordalongitudinalstudy AT kimminsoon howdoeslearnabilityofprimarycareresidentphysiciansincreaseaftersevenmonthsofusinganelectronichealthrecordalongitudinalstudy |