Diverse patients’ attitudes towards Artificial Intelligence (AI) in diagnosis
Main Authors: Robertson, Christopher; Woods, Andrew; Bergstrand, Kelly; Findley, Jess; Balser, Cayley; Slepian, Marvin J.
Format: Online Article Text
Language: English
Published: Public Library of Science, 2023
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10198520/ https://www.ncbi.nlm.nih.gov/pubmed/37205713 http://dx.doi.org/10.1371/journal.pdig.0000237
_version_ | 1785044753402822656 |
author | Robertson, Christopher Woods, Andrew Bergstrand, Kelly Findley, Jess Balser, Cayley Slepian, Marvin J. |
author_facet | Robertson, Christopher Woods, Andrew Bergstrand, Kelly Findley, Jess Balser, Cayley Slepian, Marvin J. |
author_sort | Robertson, Christopher |
collection | PubMed |
description | Artificial intelligence (AI) has the potential to improve diagnostic accuracy. Yet people are often reluctant to trust automated systems, and some patient populations may be particularly distrusting. We sought to determine how diverse patient populations feel about the use of AI diagnostic tools, and whether framing and informing the choice affects uptake. To construct and pretest our materials, we conducted structured interviews with a diverse set of actual patients. We then conducted a pre-registered (osf.io/9y26x), randomized, blinded survey experiment in factorial design. A survey firm provided n = 2675 responses, oversampling minoritized populations. Clinical vignettes were randomly manipulated in eight variables with two levels each: disease severity (leukemia versus sleep apnea), whether AI is proven more accurate than human specialists, whether the AI clinic is personalized to the patient through listening and/or tailoring, whether the AI clinic avoids racial and/or financial biases, whether the Primary Care Physician (PCP) promises to explain and incorporate the advice, and whether the PCP nudges the patient towards AI as the established, recommended, and easy choice. Our main outcome measure was selection of AI clinic or human physician specialist clinic (binary, “AI uptake”). We found that with weighting representative to the U.S. population, respondents were almost evenly split (52.9% chose human doctor and 47.1% chose AI clinic). In unweighted experimental contrasts of respondents who met pre-registered criteria for engagement, a PCP’s explanation that AI has proven superior accuracy increased uptake (OR = 1.48, CI: 1.24–1.77, p < .001), as did a PCP’s nudge towards AI as the established choice (OR = 1.25, CI: 1.05–1.50, p = .013), as did reassurance that the AI clinic had trained counselors to listen to the patient’s unique perspectives (OR = 1.27, CI: 1.07–1.52, p = .008). Disease severity (leukemia versus sleep apnea) and other manipulations did not affect AI uptake significantly. Compared to White respondents, Black respondents selected AI less often (OR = .73, CI: .55–.96, p = .023) and Native Americans selected it more often (OR = 1.37, CI: 1.01–1.87, p = .041). Older respondents were less likely to choose AI (OR = .99, CI: .987–.999, p = .03), as were those who identified as politically conservative (OR = .65, CI: .52–.81, p < .001) or viewed religion as important (OR = .64, CI: .52–.77, p < .001). For each unit increase in education, the odds were 1.10 times greater for selecting an AI provider (OR = 1.10, CI: 1.03–1.18, p = .004). While many patients appear resistant to the use of AI, accuracy information, nudges and a listening patient experience may help increase acceptance. To ensure that the benefits of AI are secured in clinical practice, future research on best methods of physician incorporation and patient decision making is required. |
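The abstract reports its effects as odds ratios. As a rough illustration of their practical size, the sketch below converts an odds ratio into an implied uptake probability. It is not from the paper: the helper `apply_odds_ratio` is an illustrative assumption, and combining the unweighted experimental ORs with the weighted 47.1% baseline is only approximate.

```python
def apply_odds_ratio(p0: float, odds_ratio: float) -> float:
    """Return the probability implied by multiplying the baseline
    odds p0/(1-p0) by an odds ratio, then converting back."""
    odds = p0 / (1.0 - p0)
    new_odds = odds * odds_ratio
    return new_odds / (1.0 + new_odds)

# Baseline AI uptake in the weighted sample: 47.1%
baseline = 0.471

# Reported effects (from the unweighted contrasts): accuracy info
# OR = 1.48, nudge OR = 1.25, listening counselors OR = 1.27
for label, or_value in [("accuracy", 1.48), ("nudge", 1.25), ("listening", 1.27)]:
    print(f"{label}: {apply_odds_ratio(baseline, or_value):.3f}")
```

Under these assumptions, an OR of 1.48 moves uptake from roughly 47% to the high 50s, which is why the authors frame accuracy information as a meaningful lever.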
format | Online Article Text |
id | pubmed-10198520 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-101985202023-05-20 Diverse patients’ attitudes towards Artificial Intelligence (AI) in diagnosis Robertson, Christopher Woods, Andrew Bergstrand, Kelly Findley, Jess Balser, Cayley Slepian, Marvin J. PLOS Digit Health Research Article Public Library of Science 2023-05-19 /pmc/articles/PMC10198520/ /pubmed/37205713 http://dx.doi.org/10.1371/journal.pdig.0000237 Text en © 2023 Robertson et al https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Robertson, Christopher Woods, Andrew Bergstrand, Kelly Findley, Jess Balser, Cayley Slepian, Marvin J. Diverse patients’ attitudes towards Artificial Intelligence (AI) in diagnosis |
title | Diverse patients’ attitudes towards Artificial Intelligence (AI) in diagnosis |
title_full | Diverse patients’ attitudes towards Artificial Intelligence (AI) in diagnosis |
title_fullStr | Diverse patients’ attitudes towards Artificial Intelligence (AI) in diagnosis |
title_full_unstemmed | Diverse patients’ attitudes towards Artificial Intelligence (AI) in diagnosis |
title_short | Diverse patients’ attitudes towards Artificial Intelligence (AI) in diagnosis |
title_sort | diverse patients’ attitudes towards artificial intelligence (ai) in diagnosis |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10198520/ https://www.ncbi.nlm.nih.gov/pubmed/37205713 http://dx.doi.org/10.1371/journal.pdig.0000237 |
work_keys_str_mv | AT robertsonchristopher diversepatientsattitudestowardsartificialintelligenceaiindiagnosis AT woodsandrew diversepatientsattitudestowardsartificialintelligenceaiindiagnosis AT bergstrandkelly diversepatientsattitudestowardsartificialintelligenceaiindiagnosis AT findleyjess diversepatientsattitudestowardsartificialintelligenceaiindiagnosis AT balsercayley diversepatientsattitudestowardsartificialintelligenceaiindiagnosis AT slepianmarvinj diversepatientsattitudestowardsartificialintelligenceaiindiagnosis |