Unified ICH quantification and prognosis prediction in NCCT images using a multi-task interpretable network
Main Authors: | Gong, Kai, Dai, Qian, Wang, Jiacheng, Zheng, Yingbin, Shi, Tao, Yu, Jiaxing, Chen, Jiangwang, Huang, Shaohui, Wang, Zhanxiang |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A. 2023 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10043313/ https://www.ncbi.nlm.nih.gov/pubmed/36998725 http://dx.doi.org/10.3389/fnins.2023.1118340 |
_version_ | 1784913118082629632 |
---|---|
author | Gong, Kai Dai, Qian Wang, Jiacheng Zheng, Yingbin Shi, Tao Yu, Jiaxing Chen, Jiangwang Huang, Shaohui Wang, Zhanxiang |
author_facet | Gong, Kai Dai, Qian Wang, Jiacheng Zheng, Yingbin Shi, Tao Yu, Jiaxing Chen, Jiangwang Huang, Shaohui Wang, Zhanxiang |
author_sort | Gong, Kai |
collection | PubMed |
description | With the recent development of deep learning, regression, classification, and segmentation tasks in Computer-Aided Diagnosis (CAD) using Non-Contrast head Computed Tomography (NCCT) for spontaneous IntraCerebral Hematoma (ICH) have become popular in the field of emergency medicine. However, several challenges remain, such as the time-consuming manual evaluation of ICH volume, the excessive cost of obtaining patient-level predictions, and the requirement for high performance in both accuracy and interpretability. This paper proposes a multi-task framework consisting of upstream and downstream components to overcome these challenges. In the upstream, a weight-shared module is trained as a robust feature extractor that captures global features by performing multiple tasks (regression and classification). In the downstream, two heads are used for the two different tasks (regression and classification). The final experimental results show that the multi-task framework outperforms the single-task framework. The framework also demonstrates good interpretability through heatmaps generated by Gradient-weighted Class Activation Mapping (Grad-CAM), a widely used model-interpretation method, as presented in subsequent sections. |
format | Online Article Text |
id | pubmed-10043313 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-10043313 2023-03-29 Unified ICH quantification and prognosis prediction in NCCT images using a multi-task interpretable network Gong, Kai Dai, Qian Wang, Jiacheng Zheng, Yingbin Shi, Tao Yu, Jiaxing Chen, Jiangwang Huang, Shaohui Wang, Zhanxiang Front Neurosci Neuroscience With the recent development of deep learning, regression, classification, and segmentation tasks in Computer-Aided Diagnosis (CAD) using Non-Contrast head Computed Tomography (NCCT) for spontaneous IntraCerebral Hematoma (ICH) have become popular in the field of emergency medicine. However, several challenges remain, such as the time-consuming manual evaluation of ICH volume, the excessive cost of obtaining patient-level predictions, and the requirement for high performance in both accuracy and interpretability. This paper proposes a multi-task framework consisting of upstream and downstream components to overcome these challenges. In the upstream, a weight-shared module is trained as a robust feature extractor that captures global features by performing multiple tasks (regression and classification). In the downstream, two heads are used for the two different tasks (regression and classification). The final experimental results show that the multi-task framework outperforms the single-task framework. The framework also demonstrates good interpretability through heatmaps generated by Gradient-weighted Class Activation Mapping (Grad-CAM), a widely used model-interpretation method, as presented in subsequent sections. Frontiers Media S.A. 2023-03-14 /pmc/articles/PMC10043313/ /pubmed/36998725 http://dx.doi.org/10.3389/fnins.2023.1118340 Text en Copyright © 2023 Gong, Dai, Wang, Zheng, Shi, Yu, Chen, Huang and Wang. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). 
The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Gong, Kai Dai, Qian Wang, Jiacheng Zheng, Yingbin Shi, Tao Yu, Jiaxing Chen, Jiangwang Huang, Shaohui Wang, Zhanxiang Unified ICH quantification and prognosis prediction in NCCT images using a multi-task interpretable network |
title | Unified ICH quantification and prognosis prediction in NCCT images using a multi-task interpretable network |
title_full | Unified ICH quantification and prognosis prediction in NCCT images using a multi-task interpretable network |
title_fullStr | Unified ICH quantification and prognosis prediction in NCCT images using a multi-task interpretable network |
title_full_unstemmed | Unified ICH quantification and prognosis prediction in NCCT images using a multi-task interpretable network |
title_short | Unified ICH quantification and prognosis prediction in NCCT images using a multi-task interpretable network |
title_sort | unified ich quantification and prognosis prediction in ncct images using a multi-task interpretable network |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10043313/ https://www.ncbi.nlm.nih.gov/pubmed/36998725 http://dx.doi.org/10.3389/fnins.2023.1118340 |
work_keys_str_mv | AT gongkai unifiedichquantificationandprognosispredictioninncctimagesusingamultitaskinterpretablenetwork AT daiqian unifiedichquantificationandprognosispredictioninncctimagesusingamultitaskinterpretablenetwork AT wangjiacheng unifiedichquantificationandprognosispredictioninncctimagesusingamultitaskinterpretablenetwork AT zhengyingbin unifiedichquantificationandprognosispredictioninncctimagesusingamultitaskinterpretablenetwork AT shitao unifiedichquantificationandprognosispredictioninncctimagesusingamultitaskinterpretablenetwork AT yujiaxing unifiedichquantificationandprognosispredictioninncctimagesusingamultitaskinterpretablenetwork AT chenjiangwang unifiedichquantificationandprognosispredictioninncctimagesusingamultitaskinterpretablenetwork AT huangshaohui unifiedichquantificationandprognosispredictioninncctimagesusingamultitaskinterpretablenetwork AT wangzhanxiang unifiedichquantificationandprognosispredictioninncctimagesusingamultitaskinterpretablenetwork |
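The abstract above describes a shared upstream feature extractor feeding two downstream task heads: a regression head (hematoma quantification) and a classification head (prognosis prediction). A minimal NumPy sketch of that shared-backbone / two-head pattern follows; all layer sizes, weight names, and the toy batch are hypothetical illustrations, not the paper's actual network (which is a deep CNN operating on NCCT images):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical sizes: 64-dim input features, 32-dim shared representation.
W_shared = rng.normal(scale=0.1, size=(64, 32))  # weight-shared upstream extractor
W_reg = rng.normal(scale=0.1, size=(32, 1))      # regression head (e.g., ICH volume)
W_cls = rng.normal(scale=0.1, size=(32, 2))      # classification head (e.g., prognosis)

def forward(x):
    """One shared backbone, two task-specific heads."""
    h = relu(x @ W_shared)             # global features reused by both tasks
    volume = (h @ W_reg).squeeze(-1)   # one scalar estimate per sample
    prognosis = softmax(h @ W_cls)     # class probabilities per sample
    return volume, prognosis

x = rng.normal(size=(4, 64))           # a toy batch of 4 feature vectors
vol, prog = forward(x)
print(vol.shape, prog.shape)           # (4,) (4, 2)
```

Because the backbone is shared, gradients from both task losses would update `W_shared` during training, which is what lets the multi-task setup learn a more robust feature extractor than either single-task model alone.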