
Dual-branch hybrid network for lesion segmentation in gastric cancer images


Bibliographic Details
Main Authors: He, Dongzhi, Zhang, Yuanyu, Huang, Hui, Si, Yuhang, Wang, Zhiqiang, Li, Yunqi
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10115814/
https://www.ncbi.nlm.nih.gov/pubmed/37076573
http://dx.doi.org/10.1038/s41598-023-33462-y
author He, Dongzhi
Zhang, Yuanyu
Huang, Hui
Si, Yuhang
Wang, Zhiqiang
Li, Yunqi
author_sort He, Dongzhi
collection PubMed
description The effective segmentation of the lesion region in gastric cancer images can assist physicians in diagnosis and reduce the probability of misdiagnosis. The U-Net has been proven to provide segmentation results comparable to those of specialists in medical image segmentation because of its ability to extract high-level semantic information. However, it has limitations in obtaining global contextual information. The Transformer, on the other hand, excels at modeling explicit long-range relations but cannot capture low-level detail. Hence, this paper proposes a Dual-Branch Hybrid Network based on the fusion of Transformer and U-Net to overcome both limitations. We propose the Deep Feature Aggregation Decoder (DFA), which aggregates only the in-depth features, to obtain salient lesion features for both branches and reduce model complexity. In addition, we design a Feature Fusion (FF) module that uses multi-modal fusion mechanisms to let the independent features of the two modalities interact, and a linear Hadamard product to fuse the feature information extracted from both branches. Finally, the Transformer loss, the U-Net loss, and the fused loss are each compared with the ground-truth label for joint training. Experimental results show that the proposed method achieves an IoU of 81.3%, a Dice coefficient of 89.5%, and an accuracy of 94.0%. These metrics demonstrate that our model outperforms existing models in obtaining high-quality segmentation results and has excellent potential for clinical analysis and diagnosis. The code and implementation details are available on GitHub at https://github.com/ZYY01/DBH-Net/.
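The abstract's two key computational ideas can be illustrated with a minimal sketch: the FF module's Hadamard (element-wise) product fusing the two branches' feature maps, and a joint loss that compares the U-Net, Transformer, and fused predictions against the same ground-truth mask. This is a rough NumPy illustration under stated assumptions, not the paper's implementation: the use of soft Dice as the per-branch loss, the equal loss weights, and the function names are all assumptions for illustration.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    # Soft Dice loss between a predicted probability map and a binary mask;
    # 0 for a perfect prediction, approaching 1 for total mismatch.
    inter = (pred * target).sum()
    return 1.0 - (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def fuse_features(f_unet, f_trans):
    # Hadamard (element-wise) product fuses the two branches' feature maps,
    # as in the FF module; both maps must share the same shape.
    return f_unet * f_trans

def joint_loss(p_unet, p_trans, p_fused, target, weights=(1.0, 1.0, 1.0)):
    # Each branch's prediction and the fused prediction are compared with
    # the same ground-truth label, then combined for joint training.
    losses = [dice_loss(p, target) for p in (p_unet, p_trans, p_fused)]
    return sum(w * l for w, l in zip(weights, losses))
```

For a perfect prediction on all three outputs, `joint_loss` is close to zero; any disagreement between a branch and the mask raises the total, so both branches and the fusion are supervised simultaneously.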
format Online
Article
Text
id pubmed-10115814
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-101158142023-04-21 Dual-branch hybrid network for lesion segmentation in gastric cancer images He, Dongzhi Zhang, Yuanyu Huang, Hui Si, Yuhang Wang, Zhiqiang Li, Yunqi Sci Rep Article
Nature Publishing Group UK 2023-04-19 /pmc/articles/PMC10115814/ /pubmed/37076573 http://dx.doi.org/10.1038/s41598-023-33462-y Text en © The Author(s) 2023. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
title Dual-branch hybrid network for lesion segmentation in gastric cancer images
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10115814/
https://www.ncbi.nlm.nih.gov/pubmed/37076573
http://dx.doi.org/10.1038/s41598-023-33462-y