Dual attention network for unsupervised medical image registration based on VoxelMorph
Accurate medical image registration is crucial in a variety of neuroscience and clinical studies. In this paper, we propose a new unsupervised learning network, DAVoxelMorph, to improve the accuracy of 3D deformable medical image registration. Based on the VoxelMorph model, our network introduces...
Main Authors: | Li, Yong-xin; Tang, Hui; Wang, Wei; Zhang, Xiu-feng; Qu, Hang |
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2022 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9519746/ https://www.ncbi.nlm.nih.gov/pubmed/36171468 http://dx.doi.org/10.1038/s41598-022-20589-7 |
_version_ | 1784799469626195968 |
author | Li, Yong-xin; Tang, Hui; Wang, Wei; Zhang, Xiu-feng; Qu, Hang
author_facet | Li, Yong-xin; Tang, Hui; Wang, Wei; Zhang, Xiu-feng; Qu, Hang
author_sort | Li, Yong-xin |
collection | PubMed |
description | Accurate medical image registration is crucial in a variety of neuroscience and clinical studies. In this paper, we propose a new unsupervised learning network, DAVoxelMorph, to improve the accuracy of 3D deformable medical image registration. Based on the VoxelMorph model, our network introduces two modifications. The first is a dual attention architecture: we model semantic correlations along the spatial and coordinate dimensions respectively, where the location attention module selectively aggregates the features at each location by weighting the features of all locations, and the coordinate attention module further embeds location information into the channel attention. The second is a bending penalty introduced as a regularization term in the loss function to penalize bending in the deformation field. Experimental results show that DAVoxelMorph achieved better registration performance, including average Dice score (0.714) and percentage of locations with a non-positive Jacobian (0.345), compared with VoxelMorph (0.703, 0.355), CycleMorph (0.705, 0.133), ANTs SyN (0.707, 0.137) and NiftyReg (0.694, 0.549). Our model increases both model sensitivity and registration accuracy.
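The description above names two components concrete enough to illustrate: a location (position) attention module that re-weights every voxel's features by their similarity to the features at all other locations, and a bending penalty added to the training loss to discourage sharp bending of the deformation field. The PyTorch sketch below is not the authors' implementation; the DANet-style attention formulation, the finite-difference bending penalty, and all module, function, and parameter names (`PositionAttention3D`, `bending_penalty`, `registration_loss`, `lam`) are assumptions made for illustration only.

```python
# Minimal sketch (hypothetical, not the authors' code) of the two ideas named in
# the abstract: a 3D position/location attention module and a bending-penalty
# regularizer on the predicted deformation field.
import torch
import torch.nn as nn


class PositionAttention3D(nn.Module):
    """Re-weights each voxel's features by similarity to all other voxels
    (DANet-style position attention, adapted to 3D here as an assumption).
    Assumes channels >= 8 so the reduced query/key width is at least 1."""

    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv3d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv3d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv3d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, d, h, w = x.shape
        q = self.query(x).reshape(n, -1, d * h * w).permute(0, 2, 1)  # (N, DHW, C')
        k = self.key(x).reshape(n, -1, d * h * w)                     # (N, C', DHW)
        attn = torch.softmax(torch.bmm(q, k), dim=-1)                 # (N, DHW, DHW)
        v = self.value(x).reshape(n, -1, d * h * w)                   # (N, C, DHW)
        out = torch.bmm(v, attn.permute(0, 2, 1)).reshape(n, c, d, h, w)
        # Note: the DHW x DHW attention map is memory-heavy for full-resolution 3D volumes.
        return self.gamma * out + x


def bending_penalty(flow: torch.Tensor) -> torch.Tensor:
    """Approximate bending energy of a displacement field of shape (N, 3, D, H, W):
    mean squared second-order finite differences along each axis
    (mixed partial derivatives are omitted in this simplified version)."""
    d2z = flow[:, :, 2:] - 2 * flow[:, :, 1:-1] + flow[:, :, :-2]
    d2y = flow[:, :, :, 2:] - 2 * flow[:, :, :, 1:-1] + flow[:, :, :, :-2]
    d2x = flow[:, :, :, :, 2:] - 2 * flow[:, :, :, :, 1:-1] + flow[:, :, :, :, :-2]
    return d2z.pow(2).mean() + d2y.pow(2).mean() + d2x.pow(2).mean()


def registration_loss(warped: torch.Tensor, fixed: torch.Tensor,
                      flow: torch.Tensor, lam: float = 0.01) -> torch.Tensor:
    """Hypothetical unsupervised loss: an image-similarity term (plain MSE here)
    plus the weighted bending penalty on the predicted flow."""
    return torch.mean((warped - fixed) ** 2) + lam * bending_penalty(flow)
```

Under these assumptions, the warped moving image, the fixed image, and the predicted flow would be passed to `registration_loss` during unsupervised training; the Dice scores and the fraction of voxels with a non-positive Jacobian determinant quoted above would be computed separately at evaluation time.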
format | Online Article Text |
id | pubmed-9519746 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-9519746 2022-09-30 Dual attention network for unsupervised medical image registration based on VoxelMorph. Li, Yong-xin; Tang, Hui; Wang, Wei; Zhang, Xiu-feng; Qu, Hang. Sci Rep, Article. Nature Publishing Group UK, 2022-09-28. /pmc/articles/PMC9519746/ /pubmed/36171468 http://dx.doi.org/10.1038/s41598-022-20589-7. Text, en. © The Author(s) 2022. Open Access under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
spellingShingle | Article; Li, Yong-xin; Tang, Hui; Wang, Wei; Zhang, Xiu-feng; Qu, Hang; Dual attention network for unsupervised medical image registration based on VoxelMorph
title | Dual attention network for unsupervised medical image registration based on VoxelMorph |
title_full | Dual attention network for unsupervised medical image registration based on VoxelMorph |
title_fullStr | Dual attention network for unsupervised medical image registration based on VoxelMorph |
title_full_unstemmed | Dual attention network for unsupervised medical image registration based on VoxelMorph |
title_short | Dual attention network for unsupervised medical image registration based on VoxelMorph |
title_sort | dual attention network for unsupervised medical image registration based on voxelmorph |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9519746/ https://www.ncbi.nlm.nih.gov/pubmed/36171468 http://dx.doi.org/10.1038/s41598-022-20589-7 |
work_keys_str_mv | AT liyongxin dualattentionnetworkforunsupervisedmedicalimageregistrationbasedonvoxelmorph AT tanghui dualattentionnetworkforunsupervisedmedicalimageregistrationbasedonvoxelmorph AT wangwei dualattentionnetworkforunsupervisedmedicalimageregistrationbasedonvoxelmorph AT zhangxiufeng dualattentionnetworkforunsupervisedmedicalimageregistrationbasedonvoxelmorph AT quhang dualattentionnetworkforunsupervisedmedicalimageregistrationbasedonvoxelmorph |