
Explainable machine learning to predict long-term mortality in critically ill ventilated patients: a retrospective study in central Taiwan

BACKGROUND: Machine learning (ML) models are increasingly used to predict short-term outcomes in critically ill patients, but studies of long-term outcomes are sparse. We used an explainable ML approach to establish 30-day, 90-day and 1-year mortality prediction models in critically ill ventilated patient...

Full description

Bibliographic Details
Main Authors: Chan, Ming-Cheng, Pai, Kai-Chih, Su, Shao-An, Wang, Min-Shian, Wu, Chieh-Liang, Chao, Wen-Cheng
Format: Online Article Text
Language: English
Published: BioMed Central 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8953968/
https://www.ncbi.nlm.nih.gov/pubmed/35337303
http://dx.doi.org/10.1186/s12911-022-01817-6
_version_ 1784675978921902080
author Chan, Ming-Cheng
Pai, Kai-Chih
Su, Shao-An
Wang, Min-Shian
Wu, Chieh-Liang
Chao, Wen-Cheng
author_facet Chan, Ming-Cheng
Pai, Kai-Chih
Su, Shao-An
Wang, Min-Shian
Wu, Chieh-Liang
Chao, Wen-Cheng
author_sort Chan, Ming-Cheng
collection PubMed
description BACKGROUND: Machine learning (ML) models are increasingly used to predict short-term outcomes in critically ill patients, but studies of long-term outcomes are sparse. We used an explainable ML approach to establish 30-day, 90-day and 1-year mortality prediction models in critically ill ventilated patients. METHODS: We retrospectively included patients who were admitted to intensive care units during 2015–2018 at a tertiary hospital in central Taiwan and linked them with the Taiwanese nationwide death registration data. Three ML models, namely extreme gradient boosting (XGBoost), random forest (RF) and logistic regression (LR), were used to establish the mortality prediction models. Furthermore, we used feature importance, Shapley Additive exPlanations (SHAP) plots, partial dependence plots (PDP), and local interpretable model-agnostic explanations (LIME) to explain the established models. RESULTS: We enrolled 6994 patients and found that accuracy was similar among the three ML models; the area under the curve values for XGBoost predicting 30-day, 90-day and 1-year mortality were 0.858, 0.839 and 0.816, respectively. Calibration curves and decision curve analysis further demonstrated the accuracy and applicability of the models. The SHAP summary plot and PDP plots illustrated the discriminative points of the APACHE (Acute Physiology and Chronic Health Evaluation) II score, haemoglobin and albumin for predicting 1-year mortality. The application of LIME and SHAP force plots quantified the probability of 1-year mortality and the contribution of key features at the individual patient level. CONCLUSIONS: We used an explainable ML approach, mainly XGBoost, SHAP and LIME plots, to establish an explainable 1-year mortality prediction ML model in critically ill ventilated patients. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s12911-022-01817-6.
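The workflow described in the abstract (fitting a gradient-boosted classifier, checking discrimination with the area under the ROC curve, then explaining the model globally with SHAP/PDP and locally with LIME and SHAP force plots) can be sketched as follows. This is a minimal illustration only: the study's cohort data are not public, so it uses a synthetic stand-in dataset and hypothetical feature names (apache_ii, haemoglobin, albumin, ...), and the hyperparameters are placeholders rather than the authors' actual pipeline.

```python
# Hedged sketch of the XGBoost + SHAP/LIME/PDP workflow described in the abstract.
# Synthetic data and hypothetical feature names stand in for the (non-public) ICU cohort.
import pandas as pd
import shap
from lime.lime_tabular import LimeTabularExplainer
from sklearn.datasets import make_classification
from sklearn.inspection import PartialDependenceDisplay
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Synthetic stand-in cohort (6994 patients, as in the study; feature names are hypothetical).
feature_names = ["apache_ii", "haemoglobin", "albumin", "age", "creatinine"]
X, y = make_classification(n_samples=6994, n_features=len(feature_names),
                           weights=[0.6], random_state=0)
X = pd.DataFrame(X, columns=feature_names)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# 1) Fit an XGBoost classifier for (here) 1-year mortality.
model = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.05,
                      eval_metric="logloss")
model.fit(X_train, y_train)

# 2) Discrimination on held-out patients: area under the ROC curve.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"AUC: {auc:.3f}")

# 3) Global explanation: SHAP summary plot and a partial dependence plot.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
shap.summary_plot(shap_values, X_test, show=False)
PartialDependenceDisplay.from_estimator(model, X_test, ["apache_ii"])

# 4) Local explanation for a single patient: LIME and a SHAP force plot.
lime_explainer = LimeTabularExplainer(X_train.values,
                                      feature_names=feature_names,
                                      class_names=["survived", "died"],
                                      mode="classification")
lime_exp = lime_explainer.explain_instance(X_test.values[0],
                                           model.predict_proba,
                                           num_features=5)
print(lime_exp.as_list())
shap.force_plot(explainer.expected_value, shap_values[0], X_test.iloc[0],
                matplotlib=True, show=False)
```

The global plots (SHAP summary, PDP) indicate which features drive predictions across the cohort and where their effect changes, while LIME and the SHAP force plot break down the predicted probability for one individual patient, which is the per-patient interpretation the abstract refers to.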
format Online
Article
Text
id pubmed-8953968
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-8953968 2022-03-27 Explainable machine learning to predict long-term mortality in critically ill ventilated patients: a retrospective study in central Taiwan Chan, Ming-Cheng Pai, Kai-Chih Su, Shao-An Wang, Min-Shian Wu, Chieh-Liang Chao, Wen-Cheng BMC Med Inform Decis Mak Research BACKGROUND: Machine learning (ML) models are increasingly used to predict short-term outcomes in critically ill patients, but studies of long-term outcomes are sparse. We used an explainable ML approach to establish 30-day, 90-day and 1-year mortality prediction models in critically ill ventilated patients. METHODS: We retrospectively included patients who were admitted to intensive care units during 2015–2018 at a tertiary hospital in central Taiwan and linked them with the Taiwanese nationwide death registration data. Three ML models, namely extreme gradient boosting (XGBoost), random forest (RF) and logistic regression (LR), were used to establish the mortality prediction models. Furthermore, we used feature importance, Shapley Additive exPlanations (SHAP) plots, partial dependence plots (PDP), and local interpretable model-agnostic explanations (LIME) to explain the established models. RESULTS: We enrolled 6994 patients and found that accuracy was similar among the three ML models; the area under the curve values for XGBoost predicting 30-day, 90-day and 1-year mortality were 0.858, 0.839 and 0.816, respectively. Calibration curves and decision curve analysis further demonstrated the accuracy and applicability of the models. The SHAP summary plot and PDP plots illustrated the discriminative points of the APACHE (Acute Physiology and Chronic Health Evaluation) II score, haemoglobin and albumin for predicting 1-year mortality. The application of LIME and SHAP force plots quantified the probability of 1-year mortality and the contribution of key features at the individual patient level. CONCLUSIONS: We used an explainable ML approach, mainly XGBoost, SHAP and LIME plots, to establish an explainable 1-year mortality prediction ML model in critically ill ventilated patients. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s12911-022-01817-6. BioMed Central 2022-03-25 /pmc/articles/PMC8953968/ /pubmed/35337303 http://dx.doi.org/10.1186/s12911-022-01817-6 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
spellingShingle Research
Chan, Ming-Cheng
Pai, Kai-Chih
Su, Shao-An
Wang, Min-Shian
Wu, Chieh-Liang
Chao, Wen-Cheng
Explainable machine learning to predict long-term mortality in critically ill ventilated patients: a retrospective study in central Taiwan
title Explainable machine learning to predict long-term mortality in critically ill ventilated patients: a retrospective study in central Taiwan
title_full Explainable machine learning to predict long-term mortality in critically ill ventilated patients: a retrospective study in central Taiwan
title_fullStr Explainable machine learning to predict long-term mortality in critically ill ventilated patients: a retrospective study in central Taiwan
title_full_unstemmed Explainable machine learning to predict long-term mortality in critically ill ventilated patients: a retrospective study in central Taiwan
title_short Explainable machine learning to predict long-term mortality in critically ill ventilated patients: a retrospective study in central Taiwan
title_sort explainable machine learning to predict long-term mortality in critically ill ventilated patients: a retrospective study in central taiwan
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8953968/
https://www.ncbi.nlm.nih.gov/pubmed/35337303
http://dx.doi.org/10.1186/s12911-022-01817-6
work_keys_str_mv AT chanmingcheng explainablemachinelearningtopredictlongtermmortalityincriticallyillventilatedpatientsaretrospectivestudyincentraltaiwan
AT paikaichih explainablemachinelearningtopredictlongtermmortalityincriticallyillventilatedpatientsaretrospectivestudyincentraltaiwan
AT sushaoan explainablemachinelearningtopredictlongtermmortalityincriticallyillventilatedpatientsaretrospectivestudyincentraltaiwan
AT wangminshian explainablemachinelearningtopredictlongtermmortalityincriticallyillventilatedpatientsaretrospectivestudyincentraltaiwan
AT wuchiehliang explainablemachinelearningtopredictlongtermmortalityincriticallyillventilatedpatientsaretrospectivestudyincentraltaiwan
AT chaowencheng explainablemachinelearningtopredictlongtermmortalityincriticallyillventilatedpatientsaretrospectivestudyincentraltaiwan