
ARA-net: an attention-aware retinal atrophy segmentation network coping with fundus images

BACKGROUND: Accurately detecting and segmenting areas of retinal atrophy are paramount for early medical intervention in pathological myopia (PM). However, segmenting retinal atrophic areas in a two-dimensional (2D) fundus image poses several challenges, such as blurred boundaries, irregular shapes, and size variation. To overcome these challenges, we propose an attention-aware retinal atrophy segmentation network (ARA-Net) to segment retinal atrophy areas from 2D fundus images. METHODS: ARA-Net adopts a UNet-like strategy to perform the area segmentation. A skip self-attention (SSA) connection block, comprising a shortcut and a parallel polarized self-attention (PPSA) block, is proposed to deal with the blurred boundaries and irregular shapes of the retinal atrophic region. Further, a multi-scale feature flow (MSFF) is proposed to address size variation: the flow is added between the SSA connection blocks, allowing the network to capture rich semantic information and detect retinal atrophy across a range of area sizes. RESULTS: The proposed method was validated on the Pathological Myopia (PALM) dataset. Experimental results demonstrate that it yields a Dice coefficient (DICE) of 84.26%, a Jaccard index (JAC) of 72.80%, and an F1-score of 84.57%, significantly outperforming competing methods. CONCLUSION: ARA-Net is an effective and efficient approach for retinal atrophic area segmentation in PM.


Bibliographic Details
Main Authors: Chen, Lei, Zhou, Yuying, Gao, Songyang, Li, Manyu, Tan, Hai, Wan, Zhijiang
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10174230/
https://www.ncbi.nlm.nih.gov/pubmed/37179557
http://dx.doi.org/10.3389/fnins.2023.1174937
_version_ 1785039985613733888
author Chen, Lei
Zhou, Yuying
Gao, Songyang
Li, Manyu
Tan, Hai
Wan, Zhijiang
author_facet Chen, Lei
Zhou, Yuying
Gao, Songyang
Li, Manyu
Tan, Hai
Wan, Zhijiang
author_sort Chen, Lei
collection PubMed
description BACKGROUND: Accurately detecting and segmenting areas of retinal atrophy are paramount for early medical intervention in pathological myopia (PM). However, segmenting retinal atrophic areas in a two-dimensional (2D) fundus image poses several challenges, such as blurred boundaries, irregular shapes, and size variation. To overcome these challenges, we propose an attention-aware retinal atrophy segmentation network (ARA-Net) to segment retinal atrophy areas from 2D fundus images. METHODS: ARA-Net adopts a UNet-like strategy to perform the area segmentation. A skip self-attention (SSA) connection block, comprising a shortcut and a parallel polarized self-attention (PPSA) block, is proposed to deal with the blurred boundaries and irregular shapes of the retinal atrophic region. Further, a multi-scale feature flow (MSFF) is proposed to address size variation: the flow is added between the SSA connection blocks, allowing the network to capture rich semantic information and detect retinal atrophy across a range of area sizes. RESULTS: The proposed method was validated on the Pathological Myopia (PALM) dataset. Experimental results demonstrate that it yields a Dice coefficient (DICE) of 84.26%, a Jaccard index (JAC) of 72.80%, and an F1-score of 84.57%, significantly outperforming competing methods. CONCLUSION: ARA-Net is an effective and efficient approach for retinal atrophic area segmentation in PM.
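The record contains no code, but the SSA connection described in METHODS (encoder features reweighted by channel-wise and spatial attention in parallel, plus an identity shortcut) can be sketched roughly as follows. This is a simplified, hypothetical stand-in using plain gating rather than the authors' actual PPSA module; the function name `ssa_skip` and the gating choices are assumptions for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ssa_skip(features):
    """Toy skip self-attention (SSA) connection.

    `features` has shape (C, H, W). Channel and spatial attention
    branches run in parallel, and an identity shortcut is added,
    mirroring the shortcut-plus-PPSA structure described in the
    abstract. The gates here are crude global-average squeezes,
    not the authors' polarized self-attention.
    """
    # Channel branch: squeeze spatial dims, gate each channel.
    channel_gate = sigmoid(features.mean(axis=(1, 2)))   # shape (C,)
    channel_out = features * channel_gate[:, None, None]
    # Spatial branch: squeeze channels, gate each pixel.
    spatial_gate = sigmoid(features.mean(axis=0))        # shape (H, W)
    spatial_out = features * spatial_gate[None, :, :]
    # Parallel composition plus the identity shortcut.
    return features + channel_out + spatial_out

x = np.random.default_rng(0).normal(size=(4, 8, 8))
y = ssa_skip(x)
print(y.shape)  # (4, 8, 8)
```

In a UNet-like network, such a block would replace the plain skip concatenation at each resolution, so attenuated boundary features are re-emphasized before reaching the decoder.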
format Online
Article
Text
id pubmed-10174230
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-10174230 2023-05-12 ARA-net: an attention-aware retinal atrophy segmentation network coping with fundus images Chen, Lei Zhou, Yuying Gao, Songyang Li, Manyu Tan, Hai Wan, Zhijiang Front Neurosci Neuroscience BACKGROUND: Accurately detecting and segmenting areas of retinal atrophy are paramount for early medical intervention in pathological myopia (PM). However, segmenting retinal atrophic areas in a two-dimensional (2D) fundus image poses several challenges, such as blurred boundaries, irregular shapes, and size variation. To overcome these challenges, we propose an attention-aware retinal atrophy segmentation network (ARA-Net) to segment retinal atrophy areas from 2D fundus images. METHODS: ARA-Net adopts a UNet-like strategy to perform the area segmentation. A skip self-attention (SSA) connection block, comprising a shortcut and a parallel polarized self-attention (PPSA) block, is proposed to deal with the blurred boundaries and irregular shapes of the retinal atrophic region. Further, a multi-scale feature flow (MSFF) is proposed to address size variation: the flow is added between the SSA connection blocks, allowing the network to capture rich semantic information and detect retinal atrophy across a range of area sizes. RESULTS: The proposed method was validated on the Pathological Myopia (PALM) dataset. Experimental results demonstrate that it yields a Dice coefficient (DICE) of 84.26%, a Jaccard index (JAC) of 72.80%, and an F1-score of 84.57%, significantly outperforming competing methods. CONCLUSION: ARA-Net is an effective and efficient approach for retinal atrophic area segmentation in PM. Frontiers Media S.A. 2023-04-27 /pmc/articles/PMC10174230/ /pubmed/37179557 http://dx.doi.org/10.3389/fnins.2023.1174937 Text en Copyright © 2023 Chen, Zhou, Gao, Li, Tan and Wan.
https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Chen, Lei
Zhou, Yuying
Gao, Songyang
Li, Manyu
Tan, Hai
Wan, Zhijiang
ARA-net: an attention-aware retinal atrophy segmentation network coping with fundus images
title ARA-net: an attention-aware retinal atrophy segmentation network coping with fundus images
title_full ARA-net: an attention-aware retinal atrophy segmentation network coping with fundus images
title_fullStr ARA-net: an attention-aware retinal atrophy segmentation network coping with fundus images
title_full_unstemmed ARA-net: an attention-aware retinal atrophy segmentation network coping with fundus images
title_short ARA-net: an attention-aware retinal atrophy segmentation network coping with fundus images
title_sort ara-net: an attention-aware retinal atrophy segmentation network coping with fundus images
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10174230/
https://www.ncbi.nlm.nih.gov/pubmed/37179557
http://dx.doi.org/10.3389/fnins.2023.1174937
work_keys_str_mv AT chenlei aranetanattentionawareretinalatrophysegmentationnetworkcopingwithfundusimages
AT zhouyuying aranetanattentionawareretinalatrophysegmentationnetworkcopingwithfundusimages
AT gaosongyang aranetanattentionawareretinalatrophysegmentationnetworkcopingwithfundusimages
AT limanyu aranetanattentionawareretinalatrophysegmentationnetworkcopingwithfundusimages
AT tanhai aranetanattentionawareretinalatrophysegmentationnetworkcopingwithfundusimages
AT wanzhijiang aranetanattentionawareretinalatrophysegmentationnetworkcopingwithfundusimages