
Calibration of the EBT3 Gafchromic Film Using HNN Deep Learning

To achieve a dose distribution conformal to the target volume while sparing normal tissues, intensity modulation with steep dose gradients is used for treatment planning. To deliver such treatments successfully, high spatial and dosimetric accuracy is crucial and must be verified. With its high 2D dosimetric resolution and self-developing property, the Ashland Inc. EBT3 Gafchromic film is a widely used quality assurance tool designed especially for this purpose. However, the film should be recalibrated each quarter because of the “aging effect,” and calibration uncertainties always exist between individual films, even within the same lot. Recently, artificial neural networks (ANNs) have been applied to many fields. If physicists accumulate the calibration data they collect, it can serve as a substantial ANN training input for film calibration. We therefore used the Keras functional Application Program Interface to build a hierarchical neural network (HNN) that takes net optical densities, pixel values, and inverse transmittances as inputs to predict the delivered dose, and trained the network with deep learning. For comparison, the film dose was also calculated from the red-channel net optical density with a power-function fit, taken as the conventional method. The results show that the percentage error of the film dose using the HNN method is less than 4% in the aging-effect verification test and less than 4.5% in the intralot variation test; in contrast, the conventional method could yield errors above 10% and 7%, respectively. This HNN method for calibrating the EBT film could be further improved by adding training data or adjusting the HNN structure. The model could help physicists spend less time on calibration and reduce film usage.

Bibliographic Details
Main Authors: Chang, Liyun, Yeh, Shyh-An, Ho, Sheng-Yow, Ding, Hueisch-Jy, Chen, Pang-Yu, Lee, Tsair-Fwu
Format: Online Article Text
Language: English
Published: Hindawi 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7892216/
https://www.ncbi.nlm.nih.gov/pubmed/33628820
http://dx.doi.org/10.1155/2021/8838401
_version_ 1783652800290357248
author Chang, Liyun
Yeh, Shyh-An
Ho, Sheng-Yow
Ding, Hueisch-Jy
Chen, Pang-Yu
Lee, Tsair-Fwu
author_facet Chang, Liyun
Yeh, Shyh-An
Ho, Sheng-Yow
Ding, Hueisch-Jy
Chen, Pang-Yu
Lee, Tsair-Fwu
author_sort Chang, Liyun
collection PubMed
description To achieve a dose distribution conformal to the target volume while sparing normal tissues, intensity modulation with steep dose gradients is used for treatment planning. To deliver such treatments successfully, high spatial and dosimetric accuracy is crucial and must be verified. With its high 2D dosimetric resolution and self-developing property, the Ashland Inc. EBT3 Gafchromic film is a widely used quality assurance tool designed especially for this purpose. However, the film should be recalibrated each quarter because of the “aging effect,” and calibration uncertainties always exist between individual films, even within the same lot. Recently, artificial neural networks (ANNs) have been applied to many fields. If physicists accumulate the calibration data they collect, it can serve as a substantial ANN training input for film calibration. We therefore used the Keras functional Application Program Interface to build a hierarchical neural network (HNN) that takes net optical densities, pixel values, and inverse transmittances as inputs to predict the delivered dose, and trained the network with deep learning. For comparison, the film dose was also calculated from the red-channel net optical density with a power-function fit, taken as the conventional method. The results show that the percentage error of the film dose using the HNN method is less than 4% in the aging-effect verification test and less than 4.5% in the intralot variation test; in contrast, the conventional method could yield errors above 10% and 7%, respectively. This HNN method for calibrating the EBT film could be further improved by adding training data or adjusting the HNN structure. The model could help physicists spend less time on calibration and reduce film usage.
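Editor's note: the description names two calibration approaches, a conventional power-function fit of the red-channel net optical density and a multi-input hierarchical neural network built with the Keras functional API. The sketch below is a minimal illustration of both under stated assumptions; the pixel values, dose levels, netOD and inverse-transmittance definitions, layer sizes, and training settings are hypothetical and are not taken from the article.

```python
# Minimal sketch (not the article's actual implementation): a power-function
# calibration of red-channel net optical density, plus a multi-input model
# built with the Keras functional API, as named in the description above.
# All numeric values, layer sizes, and training settings are hypothetical.
import numpy as np
from scipy.optimize import curve_fit
from tensorflow import keras
from tensorflow.keras import layers

def net_optical_density(pv, pv_unexposed):
    """Usual netOD definition: log10(PV_unexposed / PV_exposed)."""
    return np.log10(pv_unexposed / pv)

def inverse_transmittance(pv, pv_max=65535.0):
    """1/T, approximating transmittance as PV / PV_max for a 16-bit scan (assumption)."""
    return pv_max / pv

# Hypothetical red-channel calibration readings (dose in cGy).
pv_unexposed = 40000.0
pv = np.array([38000.0, 34000.0, 30000.0, 26000.0, 23000.0, 21000.0])
dose = np.array([50.0, 200.0, 400.0, 700.0, 1000.0, 1300.0])
net_od = net_optical_density(pv, pv_unexposed)

# Conventional method: fit dose = a * netOD**b to the calibration points.
def power_fn(x, a, b):
    return a * x**b

(a, b), _ = curve_fit(power_fn, net_od, dose, p0=(1000.0, 1.5))

# HNN sketch: one input branch per quantity, merged into a shared trunk that
# regresses the delivered dose (branch and trunk sizes are assumptions).
in_od = keras.Input(shape=(1,), name="net_optical_density")
in_pv = keras.Input(shape=(1,), name="pixel_value")
in_it = keras.Input(shape=(1,), name="inverse_transmittance")
branches = [layers.Dense(16, activation="relu")(x) for x in (in_od, in_pv, in_it)]
trunk = layers.Concatenate()(branches)
trunk = layers.Dense(32, activation="relu")(trunk)
trunk = layers.Dense(16, activation="relu")(trunk)
out_dose = layers.Dense(1, name="dose")(trunk)

hnn = keras.Model(inputs=[in_od, in_pv, in_it], outputs=out_dose)
hnn.compile(optimizer="adam", loss="mse")
hnn.fit(
    [net_od.reshape(-1, 1), pv.reshape(-1, 1), inverse_transmittance(pv).reshape(-1, 1)],
    dose.reshape(-1, 1),
    epochs=200,
    verbose=0,
)

# Either calibration can then map a new film reading to dose.
new_pv = np.array([[28000.0]])
new_od = net_optical_density(new_pv, pv_unexposed)
dose_conventional = power_fn(new_od, a, b)
dose_hnn = hnn.predict([new_od, new_pv, inverse_transmittance(new_pv)], verbose=0)
```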
format Online
Article
Text
id pubmed-7892216
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-7892216 2021-02-23 Calibration of the EBT3 Gafchromic Film Using HNN Deep Learning Chang, Liyun Yeh, Shyh-An Ho, Sheng-Yow Ding, Hueisch-Jy Chen, Pang-Yu Lee, Tsair-Fwu Biomed Res Int Research Article To achieve a dose distribution conformal to the target volume while sparing normal tissues, intensity modulation with steep dose gradients is used for treatment planning. To deliver such treatments successfully, high spatial and dosimetric accuracy is crucial and must be verified. With its high 2D dosimetric resolution and self-developing property, the Ashland Inc. EBT3 Gafchromic film is a widely used quality assurance tool designed especially for this purpose. However, the film should be recalibrated each quarter because of the “aging effect,” and calibration uncertainties always exist between individual films, even within the same lot. Recently, artificial neural networks (ANNs) have been applied to many fields. If physicists accumulate the calibration data they collect, it can serve as a substantial ANN training input for film calibration. We therefore used the Keras functional Application Program Interface to build a hierarchical neural network (HNN) that takes net optical densities, pixel values, and inverse transmittances as inputs to predict the delivered dose, and trained the network with deep learning. For comparison, the film dose was also calculated from the red-channel net optical density with a power-function fit, taken as the conventional method. The results show that the percentage error of the film dose using the HNN method is less than 4% in the aging-effect verification test and less than 4.5% in the intralot variation test; in contrast, the conventional method could yield errors above 10% and 7%, respectively. This HNN method for calibrating the EBT film could be further improved by adding training data or adjusting the HNN structure. The model could help physicists spend less time on calibration and reduce film usage. Hindawi 2021-01-31 /pmc/articles/PMC7892216/ /pubmed/33628820 http://dx.doi.org/10.1155/2021/8838401 Text en Copyright © 2021 Liyun Chang et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Chang, Liyun
Yeh, Shyh-An
Ho, Sheng-Yow
Ding, Hueisch-Jy
Chen, Pang-Yu
Lee, Tsair-Fwu
Calibration of the EBT3 Gafchromic Film Using HNN Deep Learning
title Calibration of the EBT3 Gafchromic Film Using HNN Deep Learning
title_full Calibration of the EBT3 Gafchromic Film Using HNN Deep Learning
title_fullStr Calibration of the EBT3 Gafchromic Film Using HNN Deep Learning
title_full_unstemmed Calibration of the EBT3 Gafchromic Film Using HNN Deep Learning
title_short Calibration of the EBT3 Gafchromic Film Using HNN Deep Learning
title_sort calibration of the ebt3 gafchromic film using hnn deep learning
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7892216/
https://www.ncbi.nlm.nih.gov/pubmed/33628820
http://dx.doi.org/10.1155/2021/8838401
work_keys_str_mv AT changliyun calibrationoftheebt3gafchromicfilmusinghnndeeplearning
AT yehshyhan calibrationoftheebt3gafchromicfilmusinghnndeeplearning
AT hoshengyow calibrationoftheebt3gafchromicfilmusinghnndeeplearning
AT dinghueischjy calibrationoftheebt3gafchromicfilmusinghnndeeplearning
AT chenpangyu calibrationoftheebt3gafchromicfilmusinghnndeeplearning
AT leetsairfwu calibrationoftheebt3gafchromicfilmusinghnndeeplearning