Deep learning-based 2D/3D registration of an atlas to biplanar X-ray images
Main Authors: | Van Houtte, Jeroen; Audenaert, Emmanuel; Zheng, Guoyan; Sijbers, Jan |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Springer International Publishing, 2022 |
Subjects: | Original Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9206611/ https://www.ncbi.nlm.nih.gov/pubmed/35294717 http://dx.doi.org/10.1007/s11548-022-02586-3 |
_version_ | 1784729368106369024 |
---|---|
author | Van Houtte, Jeroen; Audenaert, Emmanuel; Zheng, Guoyan; Sijbers, Jan |
author_facet | Van Houtte, Jeroen; Audenaert, Emmanuel; Zheng, Guoyan; Sijbers, Jan |
author_sort | Van Houtte, Jeroen |
collection | PubMed |
description | PURPOSE: The registration of a 3D atlas image to 2D radiographs enables 3D pre-operative planning without the need to acquire costly, high-dose CT scans. Recently, many deep-learning-based 2D/3D registration methods have been proposed that tackle the problem as a reconstruction, regressing the 3D image directly from the radiographs rather than registering an atlas image. Consequently, they are less constrained against infeasible reconstructions and cannot warp auxiliary data. Finally, they are, by construction, limited to orthogonal projections. METHODS: We propose a novel end-to-end trainable 2D/3D registration network that regresses a dense deformation field which warps an atlas image such that the forward projection of the warped atlas matches the input 2D radiographs. We take the projection matrix into account in the regression problem by integrating a projective and an inverse projective spatial transform layer into the network. RESULTS: Comprehensive experiments on simulated DRRs from patient CT images demonstrate the efficacy of the network. Our network yields an average Dice score of 0.94 and an average symmetric surface distance of 0.84 mm on our test dataset. Experimentally, projection geometries with an 80° to 100° difference between projection angles were found to give the highest accuracy. CONCLUSION: Our network accurately reconstructs patient-specific CT images from a pair of near-orthogonal calibrated radiographs by regressing a deformation field that warps an atlas image or any other auxiliary data. Our method is not constrained to orthogonal projections, increasing its applicability in medical practice. Extending the network to uncalibrated radiographs remains future work. (A minimal illustrative sketch of this warp-and-project idea follows the record fields below.) |
format | Online Article Text |
id | pubmed-9206611 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Springer International Publishing |
record_format | MEDLINE/PubMed |
spelling | pubmed-9206611 2022-06-20 Deep learning-based 2D/3D registration of an atlas to biplanar X-ray images Van Houtte, Jeroen; Audenaert, Emmanuel; Zheng, Guoyan; Sijbers, Jan Int J Comput Assist Radiol Surg Original Article PURPOSE: The registration of a 3D atlas image to 2D radiographs enables 3D pre-operative planning without the need to acquire costly, high-dose CT scans. Recently, many deep-learning-based 2D/3D registration methods have been proposed that tackle the problem as a reconstruction, regressing the 3D image directly from the radiographs rather than registering an atlas image. Consequently, they are less constrained against infeasible reconstructions and cannot warp auxiliary data. Finally, they are, by construction, limited to orthogonal projections. METHODS: We propose a novel end-to-end trainable 2D/3D registration network that regresses a dense deformation field which warps an atlas image such that the forward projection of the warped atlas matches the input 2D radiographs. We take the projection matrix into account in the regression problem by integrating a projective and an inverse projective spatial transform layer into the network. RESULTS: Comprehensive experiments on simulated DRRs from patient CT images demonstrate the efficacy of the network. Our network yields an average Dice score of 0.94 and an average symmetric surface distance of 0.84 mm on our test dataset. Experimentally, projection geometries with an 80° to 100° difference between projection angles were found to give the highest accuracy. CONCLUSION: Our network accurately reconstructs patient-specific CT images from a pair of near-orthogonal calibrated radiographs by regressing a deformation field that warps an atlas image or any other auxiliary data. Our method is not constrained to orthogonal projections, increasing its applicability in medical practice. Extending the network to uncalibrated radiographs remains future work. Springer International Publishing 2022-03-16 2022 /pmc/articles/PMC9206611/ /pubmed/35294717 http://dx.doi.org/10.1007/s11548-022-02586-3 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Original Article Van Houtte, Jeroen Audenaert, Emmanuel Zheng, Guoyan Sijbers, Jan Deep learning-based 2D/3D registration of an atlas to biplanar X-ray images |
title | Deep learning-based 2D/3D registration of an atlas to biplanar X-ray images |
title_full | Deep learning-based 2D/3D registration of an atlas to biplanar X-ray images |
title_fullStr | Deep learning-based 2D/3D registration of an atlas to biplanar X-ray images |
title_full_unstemmed | Deep learning-based 2D/3D registration of an atlas to biplanar X-ray images |
title_short | Deep learning-based 2D/3D registration of an atlas to biplanar X-ray images |
title_sort | deep learning-based 2d/3d registration of an atlas to biplanar x-ray images |
topic | Original Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9206611/ https://www.ncbi.nlm.nih.gov/pubmed/35294717 http://dx.doi.org/10.1007/s11548-022-02586-3 |
work_keys_str_mv | AT vanhouttejeroen deeplearningbased2d3dregistrationofanatlastobiplanarxrayimages AT audenaertemmanuel deeplearningbased2d3dregistrationofanatlastobiplanarxrayimages AT zhengguoyan deeplearningbased2d3dregistrationofanatlastobiplanarxrayimages AT sijbersjan deeplearningbased2d3dregistrationofanatlastobiplanarxrayimages |
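
The description field above outlines the core idea: a network regresses a dense deformation field that warps a 3D atlas, and projective / inverse projective spatial transform layers tie the warped volume to the calibrated 2D radiographs. The following is a minimal, hedged PyTorch sketch of that warp-and-project scheme only, not the authors' implementation: the function names are ours, the deformation field is taken as given rather than regressed by a network, and a toy parallel-beam projection stands in for the paper's projective spatial transform layer, which uses full calibrated projection matrices.

```python
# Illustrative sketch only (not the authors' released code): warp a 3D atlas
# volume with a dense displacement field, forward-project it, and compare the
# projections with the two input radiographs (DRRs).
import torch
import torch.nn.functional as F


def warp_volume(atlas: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Trilinearly warp an atlas volume (N, 1, D, H, W) by a dense displacement
    field `flow` (N, 3, D, H, W) given in voxels, channel order (x, y, z)."""
    n, _, d, h, w = atlas.shape
    # Identity sampling grid in normalized [-1, 1] coordinates, (x, y, z) order.
    zs, ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, d, device=atlas.device),
        torch.linspace(-1, 1, h, device=atlas.device),
        torch.linspace(-1, 1, w, device=atlas.device),
        indexing="ij",
    )
    grid = torch.stack((xs, ys, zs), dim=-1).unsqueeze(0).expand(n, -1, -1, -1, -1)
    # Convert voxel displacements to normalized coordinates and add to the grid.
    scale = torch.tensor(
        [2.0 / max(w - 1, 1), 2.0 / max(h - 1, 1), 2.0 / max(d - 1, 1)],
        device=atlas.device,
    )
    disp = flow.permute(0, 2, 3, 4, 1) * scale  # (N, D, H, W, 3)
    # 'bilinear' on a 5D input performs trilinear interpolation.
    return F.grid_sample(atlas, grid + disp, mode="bilinear", align_corners=True)


def project(volume: torch.Tensor, dim: int) -> torch.Tensor:
    """Toy parallel-beam projection: integrate intensities along one volume axis
    (dim=2 sums along D, dim=4 sums along W). The actual network applies the
    calibrated perspective projection matrix instead."""
    return volume.sum(dim=dim)


def registration_loss(atlas, flow, drr_ap, drr_lat):
    """MSE between projections of the warped atlas and two near-orthogonal
    radiographs (e.g. anterior-posterior and lateral views)."""
    warped = warp_volume(atlas, flow)
    return F.mse_loss(project(warped, 2), drr_ap) + F.mse_loss(project(warped, 4), drr_lat)


# Example shapes: a 64^3 atlas, a zero deformation field, and two 64x64 targets.
atlas = torch.rand(1, 1, 64, 64, 64)
flow = torch.zeros(1, 3, 64, 64, 64, requires_grad=True)
drr_ap, drr_lat = torch.rand(1, 1, 64, 64), torch.rand(1, 1, 64, 64)
loss = registration_loss(atlas, flow, drr_ap, drr_lat)
loss.backward()  # gradients flow back to the deformation field
```

In the paper, the displacement field is the output of the registration network and the projection step uses the actual projection matrices of the biplanar acquisition, so the loss drives the warped atlas to match both calibrated views; this sketch only illustrates how such a differentiable warp-plus-projection loss can be assembled.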