
The influence of explainable vs non-explainable clinical decision support systems on rapid triage decisions: a mixed methods study

BACKGROUND: During the COVID-19 pandemic, a variety of clinical decision support systems (CDSS) were developed to aid patient triage. However, research focusing on the interaction between decision support systems and human experts is lacking.

METHODS: Thirty-two physicians were recruited to rate the survival probability of 59 critically ill patients by means of chart review. Subsequently, one of two artificial intelligence systems advised the physician of a computed survival probability; only one of these systems explained the reasons behind its decision-making. In a third step, physicians reviewed the chart once again to determine the final survival probability rating. We hypothesized that the explaining system would have a greater influence on the physicians' second rating (i.e., a higher weight-on-advice).

RESULTS: The survival probability rating given by the physician after receiving advice from the clinical decision support system was a median of 4 percentage points closer to the advice than the initial rating. Weight-on-advice did not differ significantly (p = 0.115) between the two systems (with vs without an explanation for the decision). Weight-on-advice also showed no difference by time of day or between board-qualified and not yet board-qualified physicians. Self-reported post-experiment overall trust was awarded a median of 4 out of 10 points. When asked after the conclusion of the experiment, overall trust was 5.5/10 (non-explaining median 4 (IQR 3.5–5.5), explaining median 7 (IQR 5.5–7.5), p = 0.007).

CONCLUSIONS: Although overall trust in the models was low, the median (IQR) weight-on-advice was high (0.33 (0.0–0.56)) and in line with the published literature on expert advice. Contrary to the hypothesis, weight-on-advice was comparable between the explaining and non-explaining systems. In 30% of cases, weight-on-advice was 0, meaning the physician did not change their rating; the median of the remaining weight-on-advice values was 50%, suggesting that physicians either dismissed the recommendation or adopted a "meeting halfway" approach. Newer technologies, such as clinical reasoning systems, may be able to augment the decision process rather than simply presenting unexplained bias.

SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s12916-023-03068-2.
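
The central quantity reported above is weight-on-advice. The abstract does not spell out how it is computed, so the sketch below assumes the standard judge-advisor-system definition, WOA = (final − initial) / (advice − initial): 0 means the advice was ignored, 1 means it was adopted outright, and 0.5 matches the "meeting halfway" pattern described in the conclusions. The function name and the example numbers are illustrative and not taken from the study.

```python
# Minimal sketch (assumption): the conventional judge-advisor-system
# definition of weight-on-advice (WOA). The paper may compute it
# differently; this only illustrates how the reported values behave.

def weight_on_advice(initial: float, advice: float, final: float) -> float | None:
    """Return WOA = (final - initial) / (advice - initial).

    Returns None when the advice equals the initial rating, where the
    measure is undefined.
    """
    if advice == initial:
        return None
    return (final - initial) / (advice - initial)

# Hypothetical example: initial rating 60%, CDSS advice 40%, final rating 50%.
# The physician moved halfway toward the advice, so WOA = 0.5.
print(weight_on_advice(initial=60, advice=40, final=50))  # 0.5

# A physician who keeps the initial rating unchanged yields WOA = 0,
# the outcome observed in 30% of cases in the study.
print(weight_on_advice(initial=60, advice=40, final=60))  # 0.0
```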

Bibliographic Details
Main Authors: Laxar, Daniel; Eitenberger, Magdalena; Maleczek, Mathias; Kaider, Alexandra; Hammerle, Fabian Peter; Kimberger, Oliver
Format: Online Article (Text)
Language: English
Published: BMC Medicine (BioMed Central), 19 September 2023
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10510231/
https://www.ncbi.nlm.nih.gov/pubmed/37726729
http://dx.doi.org/10.1186/s12916-023-03068-2