CBCT-to-CT Translation Using Registration-Based Generative Adversarial Networks in Patients with Head and Neck Cancer


Bibliographic Details
Main Authors: Suwanraksa, Chitchaya, Bridhikitti, Jidapa, Liamsuwan, Thiansin, Chaichulee, Sitthichok
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10093508/
https://www.ncbi.nlm.nih.gov/pubmed/37046678
http://dx.doi.org/10.3390/cancers15072017
_version_ 1785023603141509120
author Suwanraksa, Chitchaya
Bridhikitti, Jidapa
Liamsuwan, Thiansin
Chaichulee, Sitthichok
author_facet Suwanraksa, Chitchaya
Bridhikitti, Jidapa
Liamsuwan, Thiansin
Chaichulee, Sitthichok
author_sort Suwanraksa, Chitchaya
collection PubMed
description SIMPLE SUMMARY: Cone-beam computed tomography (CBCT) not only plays an important role in image-guided radiation therapy (IGRT) but also has the potential for dose calculation. Because CBCT suffers from poor image quality and uncertainties in the Hounsfield unit (HU) values, the accuracy of dose calculation with CBCT is insufficient for clinical use. This study investigated deep learning approaches that utilize a generative adversarial network (GAN) with an additional registration network (RegNet) to generate synthetic CT (sCT) from CBCT. Our study addressed the limitation of having paired CT and CBCT with their anatomy perfectly aligned for supervised training. RegNet can dynamically estimate the correct labels, enabling supervised learning with noisy labels, whereas the GAN learns the bidirectional mapping between CBCT and CT. The HU values of sCT were sufficiently accurate for dose calculation, while preserving the anatomy of CBCT with clear structural boundaries.
ABSTRACT: Recently, deep learning with generative adversarial networks (GANs) has been applied in multi-domain image-to-image translation. This study aims to improve the image quality of cone-beam computed tomography (CBCT) by generating synthetic CT (sCT) that maintains the patient’s anatomy as in CBCT, while having the image quality of CT. As CBCT and CT are acquired at different time points, it is challenging to obtain paired images with aligned anatomy for supervised training. To address this limitation, the study incorporated a registration network (RegNet) into the GAN during training. RegNet can dynamically estimate the correct labels, allowing supervised learning with noisy labels. The study developed and evaluated the approach using imaging data from 146 patients with head and neck cancer. The results showed that GANs trained with RegNet performed better than those trained without RegNet. Specifically, in the UNIT model trained with RegNet, the mean absolute error (MAE) was reduced from 40.46 to 37.21, the root mean-square error (RMSE) was reduced from 119.45 to 108.86, the peak signal-to-noise ratio (PSNR) was increased from 28.67 to 29.55, and the structural similarity index (SSIM) was increased from 0.8630 to 0.8791. The sCT generated by the model had fewer artifacts and retained the anatomical information as in CBCT.
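The four metrics reported in the abstract (MAE, RMSE, PSNR, SSIM) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function name `image_metrics` is invented for this example, the HU data range and any body masking used by the authors are unspecified, and the SSIM here is a single global estimate rather than the usual sliding-window average.

```python
import numpy as np

def image_metrics(sct, ct, data_range=None):
    """MAE, RMSE, PSNR, and a global SSIM between a synthetic CT (sct)
    and a reference CT (ct), both given as NumPy arrays of HU values."""
    sct = np.asarray(sct, dtype=np.float64)
    ct = np.asarray(ct, dtype=np.float64)
    if data_range is None:
        # Dynamic range of the reference image; a fixed HU range may be
        # used instead, which changes PSNR and SSIM.
        data_range = ct.max() - ct.min()

    err = sct - ct
    mae = np.mean(np.abs(err))          # mean absolute error
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)                 # root mean-square error
    psnr = 10.0 * np.log10(data_range ** 2 / mse)

    # Global SSIM over the whole image; standard implementations average
    # SSIM over local sliding windows, which gives different values.
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    mu_x, mu_y = sct.mean(), ct.mean()
    cov = np.mean((sct - mu_x) * (ct - mu_y))
    ssim = ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (sct.var() + ct.var() + c2))
    return mae, rmse, psnr, ssim
```

In practice, windowed SSIM and PSNR are usually computed with library routines such as those in scikit-image, which also handle the data-range convention consistently.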
format Online
Article
Text
id pubmed-10093508
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-100935082023-04-13 CBCT-to-CT Translation Using Registration-Based Generative Adversarial Networks in Patients with Head and Neck Cancer Suwanraksa, Chitchaya Bridhikitti, Jidapa Liamsuwan, Thiansin Chaichulee, Sitthichok Cancers (Basel) Article MDPI 2023-03-28 /pmc/articles/PMC10093508/ /pubmed/37046678 http://dx.doi.org/10.3390/cancers15072017 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Suwanraksa, Chitchaya
Bridhikitti, Jidapa
Liamsuwan, Thiansin
Chaichulee, Sitthichok
CBCT-to-CT Translation Using Registration-Based Generative Adversarial Networks in Patients with Head and Neck Cancer
title CBCT-to-CT Translation Using Registration-Based Generative Adversarial Networks in Patients with Head and Neck Cancer
title_full CBCT-to-CT Translation Using Registration-Based Generative Adversarial Networks in Patients with Head and Neck Cancer
title_fullStr CBCT-to-CT Translation Using Registration-Based Generative Adversarial Networks in Patients with Head and Neck Cancer
title_full_unstemmed CBCT-to-CT Translation Using Registration-Based Generative Adversarial Networks in Patients with Head and Neck Cancer
title_short CBCT-to-CT Translation Using Registration-Based Generative Adversarial Networks in Patients with Head and Neck Cancer
title_sort cbct-to-ct translation using registration-based generative adversarial networks in patients with head and neck cancer
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10093508/
https://www.ncbi.nlm.nih.gov/pubmed/37046678
http://dx.doi.org/10.3390/cancers15072017
work_keys_str_mv AT suwanraksachitchaya cbcttocttranslationusingregistrationbasedgenerativeadversarialnetworksinpatientswithheadandneckcancer
AT bridhikittijidapa cbcttocttranslationusingregistrationbasedgenerativeadversarialnetworksinpatientswithheadandneckcancer
AT liamsuwanthiansin cbcttocttranslationusingregistrationbasedgenerativeadversarialnetworksinpatientswithheadandneckcancer
AT chaichuleesitthichok cbcttocttranslationusingregistrationbasedgenerativeadversarialnetworksinpatientswithheadandneckcancer