Employing computational linguistics techniques to identify limited patient health literacy: Findings from the ECLIPPSE study
OBJECTIVE: To develop novel, scalable, and valid literacy profiles for identifying limited health literacy patients by harnessing natural language processing. DATA SOURCE: With respect to the linguistic content, we analyzed 283 216 secure messages sent by 6941 diabetes patients to physicians within...
Main Authors: Schillinger, Dean; Balyan, Renu; Crossley, Scott A.; McNamara, Danielle S.; Liu, Jennifer Y.; Karter, Andrew J.
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc., 2020
Subjects: Evaluation Tools
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7839650/ https://www.ncbi.nlm.nih.gov/pubmed/32966630 http://dx.doi.org/10.1111/1475-6773.13560
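The abstract describes deriving a Flesch‐Kincaid-based literacy profile (LP_FK) from patients' secure messages. As a minimal, self-contained sketch only (not the ECLIPPSE pipeline, and using a crude vowel-group syllable heuristic rather than a dictionary), the Flesch-Kincaid grade level of a message can be computed from its word, sentence, and syllable counts:

```python
# Rough illustration: Flesch-Kincaid grade level for a single patient message.
# Syllables are approximated as runs of consecutive vowels (minimum 1 per word).
import re

def count_syllables(word: str) -> int:
    """Approximate syllable count via vowel groups."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """FK grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

if __name__ == "__main__":
    message = "My sugar was high this morning. Should I change my insulin dose before dinner?"
    print(f"Approximate Flesch-Kincaid grade: {flesch_kincaid_grade(message):.1f}")
```

In practice, message-level scores would be aggregated per patient before being related to health literacy; the aggregation and modeling choices here are assumptions, not details taken from the study.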
_version_ | 1783643426550448128 |
author | Schillinger, Dean; Balyan, Renu; Crossley, Scott A.; McNamara, Danielle S.; Liu, Jennifer Y.; Karter, Andrew J.
author_facet | Schillinger, Dean; Balyan, Renu; Crossley, Scott A.; McNamara, Danielle S.; Liu, Jennifer Y.; Karter, Andrew J.
author_sort | Schillinger, Dean |
collection | PubMed |
description | OBJECTIVE: To develop novel, scalable, and valid literacy profiles for identifying limited health literacy patients by harnessing natural language processing. DATA SOURCE: With respect to the linguistic content, we analyzed 283 216 secure messages sent by 6941 diabetes patients to physicians within an integrated system's electronic portal. Sociodemographic, clinical, and utilization data were obtained via questionnaire and electronic health records. STUDY DESIGN: Retrospective study used natural language processing and machine learning to generate five unique “Literacy Profiles” by employing various sets of linguistic indices: Flesch‐Kincaid (LP_FK); basic indices of writing complexity, including lexical diversity (LP_LD) and writing quality (LP_WQ); and advanced indices related to syntactic complexity, lexical sophistication, and diversity, modeled from self‐reported (LP_SR), and expert‐rated (LP_Exp) health literacy. We first determined the performance of each literacy profile relative to self‐reported and expert‐rated health literacy to discriminate between high and low health literacy and then assessed Literacy Profiles’ relationships with known correlates of health literacy, such as patient sociodemographics and a range of health‐related outcomes, including ratings of physician communication, medication adherence, diabetes control, comorbidities, and utilization. PRINCIPAL FINDINGS: LP_SR and LP_Exp performed best in discriminating between high and low self‐reported (C‐statistics: 0.86 and 0.58, respectively) and expert‐rated health literacy (C‐statistics: 0.71 and 0.87, respectively) and were significantly associated with educational attainment, race/ethnicity, Consumer Assessment of Provider and Systems (CAHPS) scores, adherence, glycemia, comorbidities, and emergency department visits. CONCLUSIONS: Since health literacy is a potentially remediable explanatory factor in health care disparities, the development of automated health literacy indicators represents a significant accomplishment with broad clinical and population health applications. Health systems could apply literacy profiles to efficiently determine whether quality of care and outcomes vary by patient health literacy; identify at‐risk populations for targeting tailored health communications and self‐management support interventions; and inform clinicians to promote improvements in individual‐level care. |
format | Online Article Text |
id | pubmed-7839650 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | John Wiley and Sons Inc. |
record_format | MEDLINE/PubMed |
spelling | pubmed-7839650 2021-02-02 Employing computational linguistics techniques to identify limited patient health literacy: Findings from the ECLIPPSE study Schillinger, Dean Balyan, Renu Crossley, Scott A. McNamara, Danielle S. Liu, Jennifer Y. Karter, Andrew J. Health Serv Res Evaluation Tools OBJECTIVE: To develop novel, scalable, and valid literacy profiles for identifying limited health literacy patients by harnessing natural language processing. DATA SOURCE: With respect to the linguistic content, we analyzed 283 216 secure messages sent by 6941 diabetes patients to physicians within an integrated system's electronic portal. Sociodemographic, clinical, and utilization data were obtained via questionnaire and electronic health records. STUDY DESIGN: Retrospective study used natural language processing and machine learning to generate five unique “Literacy Profiles” by employing various sets of linguistic indices: Flesch‐Kincaid (LP_FK); basic indices of writing complexity, including lexical diversity (LP_LD) and writing quality (LP_WQ); and advanced indices related to syntactic complexity, lexical sophistication, and diversity, modeled from self‐reported (LP_SR), and expert‐rated (LP_Exp) health literacy. We first determined the performance of each literacy profile relative to self‐reported and expert‐rated health literacy to discriminate between high and low health literacy and then assessed Literacy Profiles’ relationships with known correlates of health literacy, such as patient sociodemographics and a range of health‐related outcomes, including ratings of physician communication, medication adherence, diabetes control, comorbidities, and utilization. PRINCIPAL FINDINGS: LP_SR and LP_Exp performed best in discriminating between high and low self‐reported (C‐statistics: 0.86 and 0.58, respectively) and expert‐rated health literacy (C‐statistics: 0.71 and 0.87, respectively) and were significantly associated with educational attainment, race/ethnicity, Consumer Assessment of Provider and Systems (CAHPS) scores, adherence, glycemia, comorbidities, and emergency department visits. CONCLUSIONS: Since health literacy is a potentially remediable explanatory factor in health care disparities, the development of automated health literacy indicators represents a significant accomplishment with broad clinical and population health applications. Health systems could apply literacy profiles to efficiently determine whether quality of care and outcomes vary by patient health literacy; identify at‐risk populations for targeting tailored health communications and self‐management support interventions; and inform clinicians to promote improvements in individual‐level care. John Wiley and Sons Inc. 2020-09-23 2021-02 /pmc/articles/PMC7839650/ /pubmed/32966630 http://dx.doi.org/10.1111/1475-6773.13560 Text en © 2020 The Authors. Health Services Research published by Wiley Periodicals LLC on behalf of Health Research and Educational Trust This is an open access article under the terms of the http://creativecommons.org/licenses/by-nc-nd/4.0/ License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non‐commercial and no modifications or adaptations are made. |
spellingShingle | Evaluation Tools; Schillinger, Dean; Balyan, Renu; Crossley, Scott A.; McNamara, Danielle S.; Liu, Jennifer Y.; Karter, Andrew J.; Employing computational linguistics techniques to identify limited patient health literacy: Findings from the ECLIPPSE study
title | Employing computational linguistics techniques to identify limited patient health literacy: Findings from the ECLIPPSE study |
title_full | Employing computational linguistics techniques to identify limited patient health literacy: Findings from the ECLIPPSE study |
title_fullStr | Employing computational linguistics techniques to identify limited patient health literacy: Findings from the ECLIPPSE study |
title_full_unstemmed | Employing computational linguistics techniques to identify limited patient health literacy: Findings from the ECLIPPSE study |
title_short | Employing computational linguistics techniques to identify limited patient health literacy: Findings from the ECLIPPSE study |
title_sort | employing computational linguistics techniques to identify limited patient health literacy: findings from the eclippse study |
topic | Evaluation Tools |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7839650/ https://www.ncbi.nlm.nih.gov/pubmed/32966630 http://dx.doi.org/10.1111/1475-6773.13560 |
work_keys_str_mv | AT schillingerdean employingcomputationallinguisticstechniquestoidentifylimitedpatienthealthliteracyfindingsfromtheeclippsestudy AT balyanrenu employingcomputationallinguisticstechniquestoidentifylimitedpatienthealthliteracyfindingsfromtheeclippsestudy AT crossleyscotta employingcomputationallinguisticstechniquestoidentifylimitedpatienthealthliteracyfindingsfromtheeclippsestudy AT mcnamaradanielles employingcomputationallinguisticstechniquestoidentifylimitedpatienthealthliteracyfindingsfromtheeclippsestudy AT liujennifery employingcomputationallinguisticstechniquestoidentifylimitedpatienthealthliteracyfindingsfromtheeclippsestudy AT karterandrewj employingcomputationallinguisticstechniquestoidentifylimitedpatienthealthliteracyfindingsfromtheeclippsestudy |
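The description field above reports C-statistics for how well each literacy profile discriminates between high and low health literacy. For a binary outcome, the C-statistic is equivalent to the area under the ROC curve; the sketch below, which uses hypothetical per-patient linguistic indices and simulated labels rather than any ECLIPPSE data, shows how such a statistic might be estimated for a simple logistic regression classifier.

```python
# Illustrative only: C-statistic (ROC AUC) for discriminating limited vs. adequate
# health literacy from per-patient linguistic indices. All data here are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_patients = 500

# Hypothetical per-patient aggregates (e.g., mean FK grade, lexical diversity, word length).
X = rng.normal(size=(n_patients, 3))
# Hypothetical binary label: 1 = limited health literacy, 0 = adequate.
y = (-0.8 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n_patients) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]  # predicted probability of limited literacy

# Area under the ROC curve, i.e., the C-statistic reported in the abstract.
print(f"C-statistic (ROC AUC): {roc_auc_score(y_test, scores):.2f}")
```

The study itself modeled profiles against self-reported and expert-rated health literacy using a broader set of indices and machine learning methods; this snippet only illustrates the evaluation metric, not the authors' models.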