
Split-Attention U-Net: A Fully Convolutional Network for Robust Multi-Label Segmentation from Brain MRI

Multi-label brain segmentation from brain magnetic resonance imaging (MRI) provides valuable structural information for most neurological analyses. Owing to the complexity of brain segmentation algorithms, the delivery of neuroimaging findings can be delayed. Therefore, we introduce Split-Attention...

Full description

Bibliographic Details
Main Authors: Lee, Minho, Kim, JeeYoung, EY Kim, Regina, Kim, Hyun Gi, Oh, Se Won, Lee, Min Kyoung, Wang, Sheng-Min, Kim, Nak-Young, Kang, Dong Woo, Rieu, ZunHyan, Yong, Jung Hyun, Kim, Donghyeon, Lim, Hyun Kook
Format: Online Article Text
Language: English
Published: MDPI 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7764312/
https://www.ncbi.nlm.nih.gov/pubmed/33322640
http://dx.doi.org/10.3390/brainsci10120974
author Lee, Minho
Kim, JeeYoung
EY Kim, Regina
Kim, Hyun Gi
Oh, Se Won
Lee, Min Kyoung
Wang, Sheng-Min
Kim, Nak-Young
Kang, Dong Woo
Rieu, ZunHyan
Yong, Jung Hyun
Kim, Donghyeon
Lim, Hyun Kook
collection PubMed
description Multi-label brain segmentation from brain magnetic resonance imaging (MRI) provides valuable structural information for most neurological analyses. Owing to the complexity of brain segmentation algorithms, the delivery of neuroimaging findings can be delayed. Therefore, we introduce Split-Attention U-Net (SAU-Net), a convolutional neural network with skip pathways and a split-attention module that segments brain MRI scans. The proposed architecture employs split-attention blocks, skip pathways with pyramid levels, and evolving normalization layers. For efficient training, we performed pre-training and fine-tuning with the original and manually modified FreeSurfer labels, respectively. This learning strategy allows heterogeneous neuroimaging data to be included in training without requiring many manual annotations. Using nine evaluation datasets, we demonstrated that SAU-Net achieved segmentation accuracy and reliability surpassing those of state-of-the-art methods. We believe that SAU-Net has excellent potential owing to its robustness to neuroanatomical variability, which would enable almost instantaneous access to accurate neuroimaging biomarkers, and its swift processing runtime compared with the other methods investigated.
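
To make the architectural description above concrete, the following is a minimal, hypothetical PyTorch sketch of a split-attention block operating on 3D feature maps. It is not the authors' SAU-Net implementation; the module name SplitAttention3d, the radix and reduction values, and the use of 3D convolutions are assumptions made for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SplitAttention3d(nn.Module):
    # Minimal ResNeSt-style split-attention block for 3D feature maps.
    # Hypothetical sketch: not the authors' SAU-Net code; radix, reduction,
    # and the use of 3D convolutions are assumptions for illustration.
    def __init__(self, channels, radix=2, reduction=4):
        super().__init__()
        assert channels % radix == 0, "channels must be divisible by radix"
        self.radix, self.channels = radix, channels
        inter = max(channels // reduction, 8)
        # One grouped convolution produces `radix` parallel feature splits.
        self.conv = nn.Conv3d(channels, channels * radix, kernel_size=3,
                              padding=1, groups=radix, bias=False)
        self.bn = nn.BatchNorm3d(channels * radix)
        # Small bottleneck that predicts per-split channel attention weights.
        self.fc1 = nn.Conv3d(channels, inter, kernel_size=1)
        self.fc2 = nn.Conv3d(inter, channels * radix, kernel_size=1)

    def forward(self, x):
        b = x.size(0)
        splits = F.relu(self.bn(self.conv(x)))                     # (B, C*R, D, H, W)
        splits = splits.view(b, self.radix, self.channels, *splits.shape[2:])
        gap = splits.sum(dim=1).mean(dim=(2, 3, 4), keepdim=True)  # pooled global context, (B, C, 1, 1, 1)
        attn = self.fc2(F.relu(self.fc1(gap)))                     # (B, C*R, 1, 1, 1)
        attn = F.softmax(attn.view(b, self.radix, self.channels, 1, 1, 1), dim=1)
        return (attn * splits).sum(dim=1)                          # fused output, (B, C, D, H, W)

# Example: channel count and spatial size are preserved.
block = SplitAttention3d(32)
out = block(torch.randn(1, 32, 16, 16, 16))   # -> torch.Size([1, 32, 16, 16, 16])

The softmax across splits lets the block reweight its parallel feature groups using pooled global context, which is the core idea the abstract refers to as split attention.
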
format Online
Article
Text
id pubmed-7764312
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7764312 2020-12-27 Split-Attention U-Net: A Fully Convolutional Network for Robust Multi-Label Segmentation from Brain MRI. Brain Sci, Article. MDPI 2020-12-11 /pmc/articles/PMC7764312/ /pubmed/33322640 http://dx.doi.org/10.3390/brainsci10120974 Text en © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
title Split-Attention U-Net: A Fully Convolutional Network for Robust Multi-Label Segmentation from Brain MRI
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7764312/
https://www.ncbi.nlm.nih.gov/pubmed/33322640
http://dx.doi.org/10.3390/brainsci10120974