
Inter-fraction deformable image registration using unsupervised deep learning for CBCT-guided abdominal radiotherapy


Bibliographic Details
Main Authors: Xie, Huiqiao, Lei, Yang, Fu, Yabo, Wang, Tonghe, Roper, Justin, Bradley, Jeffrey D, Patel, Pretesh, Liu, Tian, Yang, Xiaofeng
Format: Online Article Text
Language: English
Published: IOP Publishing 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10099091/
https://www.ncbi.nlm.nih.gov/pubmed/36958049
http://dx.doi.org/10.1088/1361-6560/acc721
_version_ 1785024974397898752
author Xie, Huiqiao
Lei, Yang
Fu, Yabo
Wang, Tonghe
Roper, Justin
Bradley, Jeffrey D
Patel, Pretesh
Liu, Tian
Yang, Xiaofeng
author_facet Xie, Huiqiao
Lei, Yang
Fu, Yabo
Wang, Tonghe
Roper, Justin
Bradley, Jeffrey D
Patel, Pretesh
Liu, Tian
Yang, Xiaofeng
author_sort Xie, Huiqiao
collection PubMed
description Objective. CBCTs in image-guided radiotherapy provide crucial anatomic information for patient setup and plan evaluation. Longitudinal CBCT image registration could quantify inter-fractional anatomic changes, e.g. tumor shrinkage, and daily OAR variation throughout the course of treatment. The purpose of this study is to propose an unsupervised deep learning-based CBCT-CBCT deformable image registration method that enables quantitative anatomic variation analysis. Approach. The proposed deformable registration workflow consists of training and inference stages that share the same feed-forward path through a spatial transformation-based network (STN). The STN consists of a global generative adversarial network (GlobalGAN) and a local GAN (LocalGAN) to predict the coarse- and fine-scale motions, respectively. The network was trained by minimizing the image similarity loss and the deformation vector field (DVF) regularization loss without the supervision of ground-truth DVFs. During the inference stage, patches of the local DVF were predicted by the trained LocalGAN and fused to form a whole-image DVF. This local whole-image DVF was subsequently combined with the GlobalGAN-generated DVF to obtain the final DVF. The proposed method was evaluated using 100 fractional CBCTs from 20 abdominal cancer patients in the experiments and 105 fractional CBCTs from a cohort of 21 different abdominal cancer patients in a holdout test. Main Results. Qualitatively, the registration results show good alignment between the deformed CBCT images and the target CBCT image. Quantitatively, the average target registration error calculated on the fiducial markers and manually identified landmarks was 1.91 ± 1.18 mm. The average mean absolute error and normalized cross correlation between the deformed CBCT and the target CBCT were 33.42 ± 7.48 HU and 0.94 ± 0.04, respectively. Significance. In summary, an unsupervised deep learning-based CBCT-CBCT registration method is proposed, and its feasibility and performance in fractionated image-guided radiotherapy are investigated. This promising registration method could provide fast and accurate longitudinal CBCT alignment to facilitate the analysis and prediction of inter-fractional anatomic changes.
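As a rough illustration of the unsupervised objective described above (an image-similarity term plus a DVF regularization term, with the moving CBCT warped through a spatial transformer), a minimal PyTorch-style sketch follows. The network `net`, the loss weight `lam`, the NCC and smoothness formulations, and the DVF channel ordering are illustrative assumptions; the paper's GlobalGAN/LocalGAN architecture and any adversarial terms are not reproduced here.

```python
# Minimal, self-contained sketch of the unsupervised objective described in the
# abstract: an image-similarity term plus a DVF smoothness regularizer, with the
# moving CBCT warped through a spatial transformer. The network `net`, the loss
# weight `lam`, and the DVF channel ordering are illustrative assumptions; the
# paper's GlobalGAN/LocalGAN architecture and adversarial terms are not reproduced.
import torch
import torch.nn.functional as F


def warp(moving: torch.Tensor, dvf: torch.Tensor) -> torch.Tensor:
    """Warp a 3D volume (N, 1, D, H, W) with a dense DVF (N, 3, D, H, W) given in voxels."""
    _, _, d, h, w = moving.shape
    # Identity sampling grid in the normalized [-1, 1] coordinates expected by grid_sample.
    zz, yy, xx = torch.meshgrid(
        torch.linspace(-1.0, 1.0, d),
        torch.linspace(-1.0, 1.0, h),
        torch.linspace(-1.0, 1.0, w),
        indexing="ij",
    )
    identity = torch.stack((xx, yy, zz), dim=-1).to(moving)  # (D, H, W, 3), (x, y, z) order
    # Convert voxel displacements to normalized units and add them to the identity grid.
    scale = torch.tensor([2.0 / max(w - 1, 1), 2.0 / max(h - 1, 1), 2.0 / max(d - 1, 1)]).to(dvf)
    disp = dvf.permute(0, 2, 3, 4, 1) * scale  # (N, D, H, W, 3), channels assumed (x, y, z)
    grid = identity.unsqueeze(0) + disp
    return F.grid_sample(moving, grid, mode="bilinear", align_corners=True)


def ncc_loss(a: torch.Tensor, b: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Negative global normalized cross correlation (higher similarity -> lower loss)."""
    a = a - a.mean(dim=(1, 2, 3, 4), keepdim=True)
    b = b - b.mean(dim=(1, 2, 3, 4), keepdim=True)
    num = (a * b).sum(dim=(1, 2, 3, 4))
    den = torch.sqrt((a * a).sum(dim=(1, 2, 3, 4)) * (b * b).sum(dim=(1, 2, 3, 4)) + eps)
    return -(num / den).mean()


def smoothness_loss(dvf: torch.Tensor) -> torch.Tensor:
    """First-order finite-difference penalty on the DVF, a common choice of regularizer."""
    dz = (dvf[:, :, 1:, :, :] - dvf[:, :, :-1, :, :]).abs().mean()
    dy = (dvf[:, :, :, 1:, :] - dvf[:, :, :, :-1, :]).abs().mean()
    dx = (dvf[:, :, :, :, 1:] - dvf[:, :, :, :, :-1]).abs().mean()
    return dz + dy + dx


def training_step(net, moving_cbct, target_cbct, optimizer, lam=0.1):
    """One unsupervised step: no ground-truth DVF, only image similarity + DVF regularization."""
    optimizer.zero_grad()
    dvf = net(torch.cat([moving_cbct, target_cbct], dim=1))  # hypothetical DVF-predicting network
    warped = warp(moving_cbct, dvf)
    loss = ncc_loss(warped, target_cbct) + lam * smoothness_loss(dvf)
    loss.backward()
    optimizer.step()
    return loss.item()
```

At inference, the abstract describes predicting local DVF patches with the trained LocalGAN, fusing them into a whole-image DVF, and combining that with the GlobalGAN DVF; a simple fusion would average overlapping patch predictions before adding the global DVF, though the exact fusion and composition scheme is not detailed here.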
format Online
Article
Text
id pubmed-10099091
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher IOP Publishing
record_format MEDLINE/PubMed
spelling pubmed-100990912023-04-14 Inter-fraction deformable image registration using unsupervised deep learning for CBCT-guided abdominal radiotherapy Xie, Huiqiao Lei, Yang Fu, Yabo Wang, Tonghe Roper, Justin Bradley, Jeffrey D Patel, Pretesh Liu, Tian Yang, Xiaofeng Phys Med Biol Paper Objective. CBCTs in image-guided radiotherapy provide crucial anatomic information for patient setup and plan evaluation. Longitudinal CBCT image registration could quantify inter-fractional anatomic changes, e.g. tumor shrinkage, and daily OAR variation throughout the course of treatment. The purpose of this study is to propose an unsupervised deep learning-based CBCT-CBCT deformable image registration method that enables quantitative anatomic variation analysis. Approach. The proposed deformable registration workflow consists of training and inference stages that share the same feed-forward path through a spatial transformation-based network (STN). The STN consists of a global generative adversarial network (GlobalGAN) and a local GAN (LocalGAN) to predict the coarse- and fine-scale motions, respectively. The network was trained by minimizing the image similarity loss and the deformation vector field (DVF) regularization loss without the supervision of ground-truth DVFs. During the inference stage, patches of the local DVF were predicted by the trained LocalGAN and fused to form a whole-image DVF. This local whole-image DVF was subsequently combined with the GlobalGAN-generated DVF to obtain the final DVF. The proposed method was evaluated using 100 fractional CBCTs from 20 abdominal cancer patients in the experiments and 105 fractional CBCTs from a cohort of 21 different abdominal cancer patients in a holdout test. Main Results. Qualitatively, the registration results show good alignment between the deformed CBCT images and the target CBCT image. Quantitatively, the average target registration error calculated on the fiducial markers and manually identified landmarks was 1.91 ± 1.18 mm. The average mean absolute error and normalized cross correlation between the deformed CBCT and the target CBCT were 33.42 ± 7.48 HU and 0.94 ± 0.04, respectively. Significance. In summary, an unsupervised deep learning-based CBCT-CBCT registration method is proposed, and its feasibility and performance in fractionated image-guided radiotherapy are investigated. This promising registration method could provide fast and accurate longitudinal CBCT alignment to facilitate the analysis and prediction of inter-fractional anatomic changes. IOP Publishing 2023-05-07 2023-04-13 /pmc/articles/PMC10099091/ /pubmed/36958049 http://dx.doi.org/10.1088/1361-6560/acc721 Text en © 2023 The Author(s). Published on behalf of Institute of Physics and Engineering in Medicine by IOP Publishing Ltd https://creativecommons.org/licenses/by/4.0/ Original content from this work may be used under the terms of the Creative Commons Attribution 4.0 licence (https://creativecommons.org/licenses/by/4.0/). Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
spellingShingle Paper
Xie, Huiqiao
Lei, Yang
Fu, Yabo
Wang, Tonghe
Roper, Justin
Bradley, Jeffrey D
Patel, Pretesh
Liu, Tian
Yang, Xiaofeng
Inter-fraction deformable image registration using unsupervised deep learning for CBCT-guided abdominal radiotherapy
title Inter-fraction deformable image registration using unsupervised deep learning for CBCT-guided abdominal radiotherapy
title_full Inter-fraction deformable image registration using unsupervised deep learning for CBCT-guided abdominal radiotherapy
title_fullStr Inter-fraction deformable image registration using unsupervised deep learning for CBCT-guided abdominal radiotherapy
title_full_unstemmed Inter-fraction deformable image registration using unsupervised deep learning for CBCT-guided abdominal radiotherapy
title_short Inter-fraction deformable image registration using unsupervised deep learning for CBCT-guided abdominal radiotherapy
title_sort inter-fraction deformable image registration using unsupervised deep learning for cbct-guided abdominal radiotherapy
topic Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10099091/
https://www.ncbi.nlm.nih.gov/pubmed/36958049
http://dx.doi.org/10.1088/1361-6560/acc721
work_keys_str_mv AT xiehuiqiao interfractiondeformableimageregistrationusingunsuperviseddeeplearningforcbctguidedabdominalradiotherapy
AT leiyang interfractiondeformableimageregistrationusingunsuperviseddeeplearningforcbctguidedabdominalradiotherapy
AT fuyabo interfractiondeformableimageregistrationusingunsuperviseddeeplearningforcbctguidedabdominalradiotherapy
AT wangtonghe interfractiondeformableimageregistrationusingunsuperviseddeeplearningforcbctguidedabdominalradiotherapy
AT roperjustin interfractiondeformableimageregistrationusingunsuperviseddeeplearningforcbctguidedabdominalradiotherapy
AT bradleyjeffreyd interfractiondeformableimageregistrationusingunsuperviseddeeplearningforcbctguidedabdominalradiotherapy
AT patelpretesh interfractiondeformableimageregistrationusingunsuperviseddeeplearningforcbctguidedabdominalradiotherapy
AT liutian interfractiondeformableimageregistrationusingunsuperviseddeeplearningforcbctguidedabdominalradiotherapy
AT yangxiaofeng interfractiondeformableimageregistrationusingunsuperviseddeeplearningforcbctguidedabdominalradiotherapy