
MM-UNet: A multimodality brain tumor segmentation network in MRI images

Bibliographic Details
Main Authors: Zhao, Liang, Ma, Jiajun, Shao, Yu, Jia, Chaoran, Zhao, Jingyuan, Yuan, Hong
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2022
Subjects: Oncology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9434799/
https://www.ncbi.nlm.nih.gov/pubmed/36059677
http://dx.doi.org/10.3389/fonc.2022.950706
_version_ 1784780964755406848
author Zhao, Liang
Ma, Jiajun
Shao, Yu
Jia, Chaoran
Zhao, Jingyuan
Yuan, Hong
author_facet Zhao, Liang
Ma, Jiajun
Shao, Yu
Jia, Chaoran
Zhao, Jingyuan
Yuan, Hong
author_sort Zhao, Liang
collection PubMed
description The global annual incidence of brain tumors is approximately seven per 100,000, accounting for 2% of all tumors. Brain tumor mortality ranks first among children under 12 years of age and tenth among adults. Therefore, the localization and segmentation of brain tumors in images constitute an active field of medical research. Traditional manual segmentation is time-consuming, laborious, and subjective. In addition, the information provided by a single imaging modality is often limited and cannot meet the needs of clinical application. Therefore, in this study, we developed a multimodality feature fusion network, MM-UNet, for brain tumor segmentation by adopting a multi-encoder and single-decoder structure. In the proposed network, each encoder independently extracts low-level features from the corresponding imaging modality, and a hybrid attention block strengthens these features. After fusion with the high-level semantic features of the decoder path through skip connections, the decoder restores the pixel-level segmentation results. We evaluated the performance of the proposed model on the BraTS 2020 dataset. MM-UNet achieved a mean Dice score of 79.2% and a mean Hausdorff distance of 8.466, a consistent improvement over the U-Net, Attention U-Net, and ResUNet baseline models that demonstrates the effectiveness of the proposed approach.
format Online
Article
Text
id pubmed-9434799
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-94347992022-09-02 MM-UNet: A multimodality brain tumor segmentation network in MRI images Zhao, Liang Ma, Jiajun Shao, Yu Jia, Chaoran Zhao, Jingyuan Yuan, Hong Front Oncol Oncology The global annual incidence of brain tumors is approximately seven per 100,000, accounting for 2% of all tumors. Brain tumor mortality ranks first among children under 12 years of age and tenth among adults. Therefore, the localization and segmentation of brain tumors in images constitute an active field of medical research. Traditional manual segmentation is time-consuming, laborious, and subjective. In addition, the information provided by a single imaging modality is often limited and cannot meet the needs of clinical application. Therefore, in this study, we developed a multimodality feature fusion network, MM-UNet, for brain tumor segmentation by adopting a multi-encoder and single-decoder structure. In the proposed network, each encoder independently extracts low-level features from the corresponding imaging modality, and a hybrid attention block strengthens these features. After fusion with the high-level semantic features of the decoder path through skip connections, the decoder restores the pixel-level segmentation results. We evaluated the performance of the proposed model on the BraTS 2020 dataset. MM-UNet achieved a mean Dice score of 79.2% and a mean Hausdorff distance of 8.466, a consistent improvement over the U-Net, Attention U-Net, and ResUNet baseline models that demonstrates the effectiveness of the proposed approach. Frontiers Media S.A. 2022-08-18 /pmc/articles/PMC9434799/ /pubmed/36059677 http://dx.doi.org/10.3389/fonc.2022.950706 Text en Copyright © 2022 Zhao, Ma, Shao, Jia, Zhao and Yuan https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Oncology
Zhao, Liang
Ma, Jiajun
Shao, Yu
Jia, Chaoran
Zhao, Jingyuan
Yuan, Hong
MM-UNet: A multimodality brain tumor segmentation network in MRI images
title MM-UNet: A multimodality brain tumor segmentation network in MRI images
title_full MM-UNet: A multimodality brain tumor segmentation network in MRI images
title_fullStr MM-UNet: A multimodality brain tumor segmentation network in MRI images
title_full_unstemmed MM-UNet: A multimodality brain tumor segmentation network in MRI images
title_short MM-UNet: A multimodality brain tumor segmentation network in MRI images
title_sort mm-unet: a multimodality brain tumor segmentation network in mri images
topic Oncology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9434799/
https://www.ncbi.nlm.nih.gov/pubmed/36059677
http://dx.doi.org/10.3389/fonc.2022.950706
work_keys_str_mv AT zhaoliang mmunetamultimodalitybraintumorsegmentationnetworkinmriimages
AT majiajun mmunetamultimodalitybraintumorsegmentationnetworkinmriimages
AT shaoyu mmunetamultimodalitybraintumorsegmentationnetworkinmriimages
AT jiachaoran mmunetamultimodalitybraintumorsegmentationnetworkinmriimages
AT zhaojingyuan mmunetamultimodalitybraintumorsegmentationnetworkinmriimages
AT yuanhong mmunetamultimodalitybraintumorsegmentationnetworkinmriimages
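
The abstract in the record above describes a multi-encoder, single-decoder design: each MRI modality is processed by its own encoder, a hybrid attention block strengthens the encoder features, and skip connections fuse them with the high-level features of a shared decoder. The paper's actual layer configuration is not given in this record, so the PyTorch sketch below only illustrates that general idea; the module names (ModalityEncoder, HybridAttention, MMUNetSketch), the channel sizes, and the use of a simple channel-attention gate are assumptions, not the authors' implementation.

```python
# Minimal multi-encoder / single-decoder sketch (PyTorch).
# Illustrative only: layer sizes, attention design, and module names are
# assumptions, not the MM-UNet implementation described in the paper.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions, each followed by batch norm and ReLU."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class HybridAttention(nn.Module):
    """Simple channel-attention gate standing in for the paper's hybrid attention block."""
    def __init__(self, ch):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(ch, ch // 4, 1), nn.ReLU(inplace=True),
            nn.Conv2d(ch // 4, ch, 1), nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)


class ModalityEncoder(nn.Module):
    """One encoder per MRI modality, returning features at two scales."""
    def __init__(self, base=16):
        super().__init__()
        self.enc1 = conv_block(1, base)
        self.enc2 = conv_block(base, base * 2)
        self.pool = nn.MaxPool2d(2)

    def forward(self, x):
        f1 = self.enc1(x)              # full-resolution, low-level features
        f2 = self.enc2(self.pool(f1))  # half-resolution, deeper features
        return f1, f2


class MMUNetSketch(nn.Module):
    """Four modality encoders, attention-weighted fusion, one shared decoder."""
    def __init__(self, n_modalities=4, n_classes=4, base=16):
        super().__init__()
        self.encoders = nn.ModuleList(ModalityEncoder(base) for _ in range(n_modalities))
        self.att1 = HybridAttention(base * n_modalities)
        self.att2 = HybridAttention(base * 2 * n_modalities)
        self.bottleneck = conv_block(base * 2 * n_modalities, base * 4)
        self.up = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.dec = conv_block(base * 2 + base * n_modalities, base * 2)
        self.head = nn.Conv2d(base * 2, n_classes, 1)

    def forward(self, x):
        # x: (batch, n_modalities, H, W); each modality goes through its own encoder.
        feats = [enc(x[:, i:i + 1]) for i, enc in enumerate(self.encoders)]
        skip1 = self.att1(torch.cat([f1 for f1, _ in feats], dim=1))  # low-level fusion
        deep = self.att2(torch.cat([f2 for _, f2 in feats], dim=1))   # deeper fusion
        d = self.up(self.bottleneck(deep))                            # decode back to full size
        d = self.dec(torch.cat([d, skip1], dim=1))                    # skip connection
        return self.head(d)                                           # per-pixel class logits


if __name__ == "__main__":
    net = MMUNetSketch()
    logits = net(torch.randn(2, 4, 64, 64))  # 4 MRI modalities, 64x64 slices
    print(logits.shape)                      # torch.Size([2, 4, 64, 64])
```

The 2D slices, two encoder scales, and small channel counts keep the sketch short; the same fusion pattern extends to deeper or 3D variants.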
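
The abstract also reports a mean Dice score of 79.2% and a mean Hausdorff distance of 8.466 on BraTS 2020. As a reminder of what the Dice metric measures, the snippet below computes the standard definition, 2|A∩B| / (|A| + |B|), for binary masks with NumPy; it is a generic illustration, not the BraTS evaluation code (the Hausdorff distance, by contrast, measures the worst-case boundary mismatch between the two masks).

```python
# Generic Dice coefficient for binary segmentation masks (NumPy).
# Standard definition only; not the BraTS 2020 evaluation pipeline.
import numpy as np


def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice = 2*|pred ∩ target| / (|pred| + |target|), in [0, 1]."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return float((2.0 * intersection + eps) / (pred.sum() + target.sum() + eps))


if __name__ == "__main__":
    a = np.zeros((8, 8), dtype=bool)
    b = np.zeros((8, 8), dtype=bool)
    a[2:6, 2:6] = True   # 16 "predicted" tumor pixels
    b[3:7, 3:7] = True   # 16 "ground-truth" tumor pixels, partially overlapping
    print(dice_score(a, b))  # 2*9 / (16+16) ≈ 0.5625
```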