Visual Acuity: Assessment of Data Quality and Usability in an Electronic Health Record System
| Field | Value |
|---|---|
| Main authors | |
| Format | Online Article Text |
| Language | English |
| Published | Elsevier, 2022 |
| Subjects | |
| Online access | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9574716/ https://www.ncbi.nlm.nih.gov/pubmed/36275199 http://dx.doi.org/10.1016/j.xops.2022.100215 |
Summary:

OBJECTIVE: To examine the data quality and usability of visual acuity (VA) data extracted from an electronic health record (EHR) system during ophthalmology encounters and to provide recommendations for considering relevant VA end points in retrospective analyses.

DESIGN: Retrospective EHR data analysis.

PARTICIPANTS: All patients with eyecare office encounters at any 1 of the 9 locations of a large academic medical center between August 1, 2013, and December 31, 2015.

METHODS: Data from 13 of the 21 VA fields (accounting for 93% of VA data) in EHR encounters were extracted, categorized, recoded, and assessed for conformance and plausibility using an internal data dictionary, a 38-item listing of VA line measurements and observations comprising 28 line measurements (e.g., 20/30, 20/400) and 10 observations (e.g., no light perception). Entries were classified into usable and unusable data. Usable data were further categorized by conformance to the internal data dictionary: (1) exact match; (2) conditional conformance, letter count (e.g., 20/30(+2)(-3)); (3) convertible conformance (e.g., 5/200 to 20/800); (4) plausible but cannot be conformed (e.g., 5/400). Data were deemed unusable when they were not plausible.

MAIN OUTCOME MEASURES: Proportions of usable and unusable VA entries overall and by subspecialty.

RESULTS: All VA data from 513 036 encounters representing 166 212 patients were included. Of the 1 573 643 VA entries, 1 438 661 (91.4%) contained usable data. There were 1 196 720 (76.0%) exact match (category 1), 185 692 (11.8%) conditional conformance (category 2), 40 270 (2.6%) convertible conformance (category 3), and 15 979 (1.0%) plausible but not conformed (category 4) entries. VA entries documented during visits with providers from retina (17.5%), glaucoma (14.0%), neuro-ophthalmology (8.9%), and low vision (8.8%) had the highest rates of unusable data. VA entries documented with providers from comprehensive eyecare (86.7%), oculoplastics (81.5%), and pediatrics/strabismus (78.6%) yielded the highest proportions of exact matches with the data dictionary.

CONCLUSIONS: EHR VA data quality and usability vary across documented VA measures, observations, and eyecare subspecialties. We proposed a checklist of considerations and recommendations for planning, extracting, analyzing, and reporting retrospective study outcomes using EHR VA data. These are important first steps toward standardizing analyses and enabling comparative research.
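To make the four-way conformance scheme concrete, the following is a minimal Python sketch of how free-text VA entries might be bucketed into the categories the abstract describes. The abbreviated dictionary contents, the observation abbreviations, the function name `classify_va_entry`, and the regular expressions are illustrative assumptions rather than the study's actual data dictionary or code; the conversion step uses standard Snellen arithmetic (5/200 is equivalent to 20/800 because 200 × 20 / 5 = 800).

```python
import re

# Hypothetical, abbreviated VA data dictionary: a handful of Snellen line
# measurements and non-numeric observations (the study's dictionary lists
# 28 line measurements and 10 observations).
SNELLEN_LINES = {
    "20/20", "20/25", "20/30", "20/40", "20/50", "20/60", "20/70",
    "20/80", "20/100", "20/200", "20/400", "20/800",
}
OBSERVATIONS = {"NLP", "LP", "HM", "CF"}  # e.g., NLP = no light perception


def classify_va_entry(raw: str) -> str:
    """Bucket a free-text VA entry into usability categories.

    Returns one of: 'exact', 'conditional', 'convertible', 'plausible',
    or 'unusable'. Illustrative only; not the published algorithm.
    """
    # Normalize: uppercase and drop parentheses so "20/30(+2)(-3)" -> "20/30+2-3".
    entry = raw.strip().upper().replace("(", "").replace(")", "")

    # Category 1: exact match to the data dictionary (line or observation).
    if entry in SNELLEN_LINES or entry in OBSERVATIONS:
        return "exact"

    # Category 2: conditional conformance -- a dictionary line plus
    # letter-count modifiers, e.g. "20/30+2" or "20/30-3".
    m = re.fullmatch(r"(20/\d+)((?:[+-]\d+)+)", entry)
    if m and m.group(1) in SNELLEN_LINES:
        return "conditional"

    # Categories 3 and 4: a Snellen fraction measured at a non-20-ft numerator.
    m = re.fullmatch(r"(\d+)/(\d+)", entry)
    if m:
        numerator, denominator = int(m.group(1)), int(m.group(2))
        if numerator > 0:
            # Convert to a 20-ft equivalent: 5/200 -> 20/(200 * 20 / 5) = 20/800.
            converted = f"20/{round(denominator * 20 / numerator)}"
            if converted in SNELLEN_LINES:
                return "convertible"   # category 3
            return "plausible"         # category 4, e.g. 5/400 -> 20/1600

    # Anything else (free text, typos, implausible values) is unusable.
    return "unusable"


if __name__ == "__main__":
    for sample in ["20/40", "20/30(+2)(-3)", "5/200", "5/400", "refused exam"]:
        print(sample, "->", classify_va_entry(sample))
```

The demo loop covers each example string from the abstract; in practice the usable/unusable split and the per-subspecialty proportions reported in the RESULTS section would come from applying this kind of classification across all extracted VA fields.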