
Performance variations among clinically available deformable image registration tools in adaptive radiotherapy — how should we evaluate and interpret the result?

The purpose of this study is to evaluate the performance variations among commercial deformable image registration (DIR) tools for adaptive radiation therapy and to interpret the differences in clinically available terms. Three clinical examples (prostate, head and neck (HN), and cranial spinal irradiation (CSI) with L-spine boost) were evaluated. First, computerized deformed CT images were generated using simulation QA software with virtual deformations of bladder filling (prostate), neck flexion/bite-block repositioning/tumor shrinkage (HN), and vertebral body rotation (CSI). The corresponding transformation matrices served as the "reference" for the subsequent comparisons. Three commercial DIR algorithms were applied between the initial images and the deformed CT sets: the free-form deformation from MIMVista 5.5 with the RegRefine from MIMMaestro 6.0, the multipass B-spline from VelocityAI v3.0.1, and the adaptive demons from OnQ rts 2.1.15. The resulting adaptive contours and dose distributions were compared with the "reference" and with each other. Contour-transfer performance was comparable among the three tools, with an average Dice similarity coefficient of 0.81 across all organs. However, dose-warping accuracy depended on the evaluation endpoints and methodologies. Point-dose differences reached up to 23.3 Gy inside the PTVs and overestimated OAR doses by up to 13.2 Gy, which is substantial for a 72 Gy prescription dose. Dose-volume histogram-based evaluation might not be sensitive enough to reveal all the detailed variations, while isodose assessment on a slice-by-slice basis could be tedious. We further explored 3D gamma index analysis for assessing warped-dose variations and observed differences in dose warping among the DIR tools. Overall, our results demonstrate that evaluation based only on contour-transformation performance cannot guarantee accuracy in dose warping, and that dose-transfer validation depends strongly on the evaluation endpoint. As dose-transfer errors could cause misinterpretation when accumulating dose for adaptive radiation therapy, and as more DIR tools become available for clinical use, a standard and clinically meaningful quality assurance criterion should be established for DIR QA in the near future. PACS number(s): 87.57
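
As a rough illustration of two of the evaluation metrics discussed above, the sketch below shows how a Dice similarity coefficient between a reference and a warped contour mask, and a brute-force global 3D gamma pass rate (e.g., 3%/3 mm), could be computed with NumPy. This is not the authors' implementation; the function names, criteria, and voxel spacing are hypothetical, and the gamma search is an O(N^2) illustration intended only for small, co-registered dose grids.

import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient of two binary masks: 2|A∩B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def gamma_pass_rate(ref_dose, eval_dose, spacing_mm, dose_crit=0.03, dta_mm=3.0):
    """Brute-force global 3D gamma pass rate (default 3%/3 mm).

    Both dose grids are assumed to be 3D, co-registered, and to share the same
    shape and voxel spacing (in mm). For every evaluation voxel, gamma is the
    minimum over all reference voxels of
        sqrt((dose difference / dose tolerance)^2 + (distance / DTA)^2),
    and the pass rate is the fraction of voxels with gamma <= 1.
    """
    ref = np.asarray(ref_dose, dtype=float)
    ev = np.asarray(eval_dose, dtype=float)
    dose_tol = dose_crit * ref.max()  # global criterion; assumes a nonzero reference dose
    # Physical coordinates (mm) of every voxel, flattened in the same C order as ravel().
    coords = np.indices(ref.shape).reshape(3, -1).T * np.asarray(spacing_mm, dtype=float)
    ref_flat, ev_flat = ref.ravel(), ev.ravel()
    gammas = np.empty(ev_flat.size)
    for i, (pos, d_ev) in enumerate(zip(coords, ev_flat)):
        dist_term = ((coords - pos) ** 2).sum(axis=1) / dta_mm ** 2
        dose_term = (ref_flat - d_ev) ** 2 / dose_tol ** 2
        gammas[i] = np.sqrt(np.min(dist_term + dose_term))
    return float(np.mean(gammas <= 1.0))

For example, dice_coefficient(reference_mask, warped_mask) returns a value between 0 and 1 (1 for perfect overlap; the study reports an average of 0.81), and gamma_pass_rate(reference_dose, warped_dose, spacing_mm=(2.5, 2.5, 2.5)) summarizes warped-dose agreement as a single pass fraction, one way to compare DIR tools beyond point-dose or DVH differences.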

Bibliographic Details
Main Authors: Nie, Ke; Pouliot, Jean; Smith, Eric; Chuang, Cynthia
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc., 2016-03-08
Journal: J Appl Clin Med Phys
Subjects: Radiation Oncology Physics
Rights: © 2016 The Authors. Open access under the terms of the Creative Commons Attribution 3.0 License (http://creativecommons.org/licenses/by/3.0/).
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5874855/
https://www.ncbi.nlm.nih.gov/pubmed/27074457
http://dx.doi.org/10.1120/jacmp.v17i2.5778