
Stigma, biomarkers, and algorithmic bias: recommendations for precision behavioral health with artificial intelligence

Effective implementation of artificial intelligence in behavioral healthcare delivery depends on overcoming challenges that are pronounced in this domain. Self and social stigma contribute to under-reported symptoms, and under-coding worsens ascertainment. Health disparities contribute to algorithmic bias. Lack of reliable biological and clinical markers hinders model development, and model explainability challenges impede trust among users. In this perspective, we describe these challenges and discuss design and implementation recommendations to overcome them in intelligent systems for behavioral and mental health.


Bibliographic Details

Main Authors: Walsh, Colin G, Chaudhry, Beenish, Dua, Prerna, Goodman, Kenneth W, Kaplan, Bonnie, Kavuluru, Ramakanth, Solomonides, Anthony, Subbian, Vignesh
Format: Online Article Text
Language: English
Published: Oxford University Press 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7309258/
https://www.ncbi.nlm.nih.gov/pubmed/32607482
http://dx.doi.org/10.1093/jamiaopen/ooz054
_version_ 1783549176634671104
author Walsh, Colin G
Chaudhry, Beenish
Dua, Prerna
Goodman, Kenneth W
Kaplan, Bonnie
Kavuluru, Ramakanth
Solomonides, Anthony
Subbian, Vignesh
author_facet Walsh, Colin G
Chaudhry, Beenish
Dua, Prerna
Goodman, Kenneth W
Kaplan, Bonnie
Kavuluru, Ramakanth
Solomonides, Anthony
Subbian, Vignesh
author_sort Walsh, Colin G
collection PubMed
description Effective implementation of artificial intelligence in behavioral healthcare delivery depends on overcoming challenges that are pronounced in this domain. Self and social stigma contribute to under-reported symptoms, and under-coding worsens ascertainment. Health disparities contribute to algorithmic bias. Lack of reliable biological and clinical markers hinders model development, and model explainability challenges impede trust among users. In this perspective, we describe these challenges and discuss design and implementation recommendations to overcome them in intelligent systems for behavioral and mental health.
format Online
Article
Text
id pubmed-7309258
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Oxford University Press
record_format MEDLINE/PubMed
spelling pubmed-73092582020-06-29 Stigma, biomarkers, and algorithmic bias: recommendations for precision behavioral health with artificial intelligence Walsh, Colin G Chaudhry, Beenish Dua, Prerna Goodman, Kenneth W Kaplan, Bonnie Kavuluru, Ramakanth Solomonides, Anthony Subbian, Vignesh JAMIA Open Perspective Effective implementation of artificial intelligence in behavioral healthcare delivery depends on overcoming challenges that are pronounced in this domain. Self and social stigma contribute to under-reported symptoms, and under-coding worsens ascertainment. Health disparities contribute to algorithmic bias. Lack of reliable biological and clinical markers hinders model development, and model explainability challenges impede trust among users. In this perspective, we describe these challenges and discuss design and implementation recommendations to overcome them in intelligent systems for behavioral and mental health. Oxford University Press 2020-01-22 /pmc/articles/PMC7309258/ /pubmed/32607482 http://dx.doi.org/10.1093/jamiaopen/ooz054 Text en © The Author(s) 2020. Published by Oxford University Press on behalf of the American Medical Informatics Association. http://creativecommons.org/licenses/by-nc/4.0/ This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited. For commercial re-use, please contact journals.permissions@oup.com
spellingShingle Perspective
Walsh, Colin G
Chaudhry, Beenish
Dua, Prerna
Goodman, Kenneth W
Kaplan, Bonnie
Kavuluru, Ramakanth
Solomonides, Anthony
Subbian, Vignesh
Stigma, biomarkers, and algorithmic bias: recommendations for precision behavioral health with artificial intelligence
title Stigma, biomarkers, and algorithmic bias: recommendations for precision behavioral health with artificial intelligence
title_full Stigma, biomarkers, and algorithmic bias: recommendations for precision behavioral health with artificial intelligence
title_fullStr Stigma, biomarkers, and algorithmic bias: recommendations for precision behavioral health with artificial intelligence
title_full_unstemmed Stigma, biomarkers, and algorithmic bias: recommendations for precision behavioral health with artificial intelligence
title_short Stigma, biomarkers, and algorithmic bias: recommendations for precision behavioral health with artificial intelligence
title_sort stigma, biomarkers, and algorithmic bias: recommendations for precision behavioral health with artificial intelligence
topic Perspective
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7309258/
https://www.ncbi.nlm.nih.gov/pubmed/32607482
http://dx.doi.org/10.1093/jamiaopen/ooz054
work_keys_str_mv AT walshcoling stigmabiomarkersandalgorithmicbiasrecommendationsforprecisionbehavioralhealthwithartificialintelligence
AT chaudhrybeenish stigmabiomarkersandalgorithmicbiasrecommendationsforprecisionbehavioralhealthwithartificialintelligence
AT duaprerna stigmabiomarkersandalgorithmicbiasrecommendationsforprecisionbehavioralhealthwithartificialintelligence
AT goodmankennethw stigmabiomarkersandalgorithmicbiasrecommendationsforprecisionbehavioralhealthwithartificialintelligence
AT kaplanbonnie stigmabiomarkersandalgorithmicbiasrecommendationsforprecisionbehavioralhealthwithartificialintelligence
AT kavulururamakanth stigmabiomarkersandalgorithmicbiasrecommendationsforprecisionbehavioralhealthwithartificialintelligence
AT solomonidesanthony stigmabiomarkersandalgorithmicbiasrecommendationsforprecisionbehavioralhealthwithartificialintelligence
AT subbianvignesh stigmabiomarkersandalgorithmicbiasrecommendationsforprecisionbehavioralhealthwithartificialintelligence