
Out with AI, in with the psychiatrist: a preference for human-derived clinical decision support in depression care

Advancements in artificial intelligence (AI) are enabling the development of clinical support tools (CSTs) in psychiatry to facilitate the review of patient data and inform clinical care. To promote their successful integration and prevent over-reliance, it is important to understand how psychiatrists will respond to information provided by AI-based CSTs, particularly if it is incorrect. We conducted an experiment to examine psychiatrists’ perceptions of AI-based CSTs for treating major depressive disorder (MDD) and to determine whether perceptions interacted with the quality of CST information. Eighty-three psychiatrists read clinical notes about a hypothetical patient with MDD and reviewed two CSTs embedded within a single dashboard: the note’s summary and a treatment recommendation. Psychiatrists were randomised to believe the source of CSTs was either AI or another psychiatrist, and across four notes, CSTs provided either correct or incorrect information. Psychiatrists rated the CSTs on various attributes. Ratings for note summaries were less favourable when psychiatrists believed the notes were generated with AI as compared to another psychiatrist, regardless of whether the notes provided correct or incorrect information. A smaller preference for psychiatrist-generated information emerged in ratings of attributes that reflected the summary’s accuracy or its inclusion of important information from the full clinical note. Ratings for treatment recommendations were also less favourable when their perceived source was AI, but only when recommendations were correct. There was little evidence that clinical expertise or familiarity with AI impacted results. These findings suggest that psychiatrists prefer human-derived CSTs. This preference was less pronounced for ratings that may have prompted a deeper review of CST information (i.e. a comparison with the full clinical note to evaluate the summary’s accuracy or completeness, assessing an incorrect treatment recommendation), suggesting a role of heuristics. Future work should explore other contributing factors and downstream implications for integrating AI into psychiatric care.

Bibliographic Details
Main Authors: Maslej, Marta M., Kloiber, Stefan, Ghassemi, Marzyeh, Yu, Joanna, Hill, Sean L.
Format: Online Article (Text)
Language: English
Journal: Transl Psychiatry
Published: Nature Publishing Group UK, 16 June 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10275935/
https://www.ncbi.nlm.nih.gov/pubmed/37328465
http://dx.doi.org/10.1038/s41398-023-02509-z
License: © The Author(s) 2023. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).