
Variability in commercially available deformable image registration: A multi‐institution analysis using virtual head and neck phantoms


Bibliographic Details
Main Authors: Kubli, Alex; Pukala, Jason; Shah, Amish P.; Kelly, Patrick; Langen, Katja M.; Bova, Frank J.; Mañon, Rafael R.; Meeks, Sanford L.
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc. 2021
Subjects: Radiation Oncology Physics
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8130225/
https://www.ncbi.nlm.nih.gov/pubmed/33783960
http://dx.doi.org/10.1002/acm2.13242
_version_ 1783694470411190272
author Kubli, Alex
Pukala, Jason
Shah, Amish P.
Kelly, Patrick
Langen, Katja M.
Bova, Frank J.
Mañon, Rafael R.
Meeks, Sanford L.
author_facet Kubli, Alex
Pukala, Jason
Shah, Amish P.
Kelly, Patrick
Langen, Katja M.
Bova, Frank J.
Mañon, Rafael R.
Meeks, Sanford L.
author_sort Kubli, Alex
collection PubMed
description PURPOSE: The purpose of this study was to evaluate the performance of three common deformable image registration (DIR) packages across algorithms and institutions. METHODS AND MATERIALS: The Deformable Image Registration Evaluation Project (DIREP) provides ten virtual phantoms derived from computed tomography (CT) datasets of head‐and‐neck cancer patients over a single treatment course. Using the DIREP phantoms, DIR results from 35 institutions were submitted using either Velocity, MIM, or Eclipse. Submitted deformation vector fields (DVFs) were compared to ground‐truth DVFs to calculate target registration error (TRE) for six regions of interest (ROIs). Statistical analysis was performed to determine the variability between the DIR software packages and the variability among users within each algorithm. RESULTS: Overall mean TRE was 2.04 ± 0.35 mm for Velocity, 1.10 ± 0.29 mm for MIM, and 2.35 ± 0.15 mm for Eclipse. The MIM mean TRE was significantly different from both Velocity and Eclipse for all ROIs. Velocity and Eclipse mean TREs were not significantly different except when evaluating registration of the cord or mandible. Significant differences between institutions were found for the MIM and Velocity platforms. However, these differences could be explained by variations in Velocity DIR parameters and MIM software versions. CONCLUSIONS: Average TRE was shown to be <3 mm for all three software platforms. However, maximum errors could be larger than 2 cm, indicating that care should be exercised when using DIR. While MIM performed statistically better than the other packages, all evaluated algorithms had an average TRE smaller than the largest voxel dimension. For the phantoms studied here, significant differences between algorithm users were minimal, suggesting that the algorithm used may have more impact on DIR accuracy than the particular registration technique employed. A significant difference in TRE was discovered between MIM versions, showing that DIR QA should be performed after software upgrades, as recommended by TG‐132.
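The TRE evaluation described in the abstract amounts to measuring, voxel by voxel inside each ROI, the Euclidean distance between a submitted deformation vector field and the ground-truth field, and then summarizing the mean and maximum error. The sketch below illustrates only that distance calculation; the function name, array layout, and voxel-to-millimeter conversion are assumptions made for illustration, not the DIREP tooling or the authors' code.

```python
import numpy as np

def target_registration_error(dvf_submitted, dvf_truth, roi_mask, voxel_spacing_mm):
    """Illustrative TRE: mean and max Euclidean distance (mm) between a submitted
    DVF and the ground-truth DVF over the voxels of one ROI.

    dvf_submitted, dvf_truth : arrays of shape (Z, Y, X, 3), displacements in voxels
    roi_mask                 : boolean array of shape (Z, Y, X)
    voxel_spacing_mm         : (dz, dy, dx) voxel dimensions in mm

    Assumed conventions (not taken from the paper): displacements are stored
    per voxel and converted to millimeters using the voxel spacing.
    """
    diff_voxels = dvf_submitted[roi_mask] - dvf_truth[roi_mask]   # (N, 3) vector differences
    diff_mm = diff_voxels * np.asarray(voxel_spacing_mm)          # voxels -> mm per axis
    distances = np.linalg.norm(diff_mm, axis=1)                   # per-voxel error magnitude
    return distances.mean(), distances.max()

# Synthetic example: a uniform 0.8-voxel (2.0 mm) offset along the x axis.
shape = (40, 64, 64)
truth = np.zeros(shape + (3,))
submitted = truth.copy()
submitted[..., 2] += 0.8                      # 0.8-voxel error in x
mask = np.zeros(shape, dtype=bool)
mask[15:25, 20:40, 20:40] = True              # a block standing in for an ROI (e.g., cord)
mean_tre, max_tre = target_registration_error(submitted, truth, mask, (3.0, 1.0, 2.5))
print(f"mean TRE = {mean_tre:.2f} mm, max TRE = {max_tre:.2f} mm")  # 2.00 mm here
```

Per-ROI means of this kind, aggregated per software package and per institution, are what the statistical comparison reported in the abstract operates on; the sketch does not attempt that analysis.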
format Online
Article
Text
id pubmed-8130225
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher John Wiley and Sons Inc.
record_format MEDLINE/PubMed
spelling pubmed-8130225 2021-05-21
Variability in commercially available deformable image registration: A multi‐institution analysis using virtual head and neck phantoms
Kubli, Alex; Pukala, Jason; Shah, Amish P.; Kelly, Patrick; Langen, Katja M.; Bova, Frank J.; Mañon, Rafael R.; Meeks, Sanford L.
J Appl Clin Med Phys (Radiation Oncology Physics)
PURPOSE: The purpose of this study was to evaluate the performance of three common deformable image registration (DIR) packages across algorithms and institutions. METHODS AND MATERIALS: The Deformable Image Registration Evaluation Project (DIREP) provides ten virtual phantoms derived from computed tomography (CT) datasets of head‐and‐neck cancer patients over a single treatment course. Using the DIREP phantoms, DIR results from 35 institutions were submitted using either Velocity, MIM, or Eclipse. Submitted deformation vector fields (DVFs) were compared to ground‐truth DVFs to calculate target registration error (TRE) for six regions of interest (ROIs). Statistical analysis was performed to determine the variability between the DIR software packages and the variability among users within each algorithm. RESULTS: Overall mean TRE was 2.04 ± 0.35 mm for Velocity, 1.10 ± 0.29 mm for MIM, and 2.35 ± 0.15 mm for Eclipse. The MIM mean TRE was significantly different from both Velocity and Eclipse for all ROIs. Velocity and Eclipse mean TREs were not significantly different except when evaluating registration of the cord or mandible. Significant differences between institutions were found for the MIM and Velocity platforms. However, these differences could be explained by variations in Velocity DIR parameters and MIM software versions. CONCLUSIONS: Average TRE was shown to be <3 mm for all three software platforms. However, maximum errors could be larger than 2 cm, indicating that care should be exercised when using DIR. While MIM performed statistically better than the other packages, all evaluated algorithms had an average TRE smaller than the largest voxel dimension. For the phantoms studied here, significant differences between algorithm users were minimal, suggesting that the algorithm used may have more impact on DIR accuracy than the particular registration technique employed. A significant difference in TRE was discovered between MIM versions, showing that DIR QA should be performed after software upgrades, as recommended by TG‐132.
John Wiley and Sons Inc. 2021-03-30
/pmc/articles/PMC8130225/
/pubmed/33783960
http://dx.doi.org/10.1002/acm2.13242
Text en
© 2021 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals LLC on behalf of American Association of Physicists in Medicine. This is an open access article under the terms of the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/), which permits use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Radiation Oncology Physics
Kubli, Alex
Pukala, Jason
Shah, Amish P.
Kelly, Patrick
Langen, Katja M.
Bova, Frank J.
Mañon, Rafael R.
Meeks, Sanford L.
Variability in commercially available deformable image registration: A multi‐institution analysis using virtual head and neck phantoms
title Variability in commercially available deformable image registration: A multi‐institution analysis using virtual head and neck phantoms
title_full Variability in commercially available deformable image registration: A multi‐institution analysis using virtual head and neck phantoms
title_fullStr Variability in commercially available deformable image registration: A multi‐institution analysis using virtual head and neck phantoms
title_full_unstemmed Variability in commercially available deformable image registration: A multi‐institution analysis using virtual head and neck phantoms
title_short Variability in commercially available deformable image registration: A multi‐institution analysis using virtual head and neck phantoms
title_sort variability in commercially available deformable image registration: a multi‐institution analysis using virtual head and neck phantoms
topic Radiation Oncology Physics
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8130225/
https://www.ncbi.nlm.nih.gov/pubmed/33783960
http://dx.doi.org/10.1002/acm2.13242
work_keys_str_mv AT kublialex variabilityincommerciallyavailabledeformableimageregistrationamultiinstitutionanalysisusingvirtualheadandneckphantoms
AT pukalajason variabilityincommerciallyavailabledeformableimageregistrationamultiinstitutionanalysisusingvirtualheadandneckphantoms
AT shahamishp variabilityincommerciallyavailabledeformableimageregistrationamultiinstitutionanalysisusingvirtualheadandneckphantoms
AT kellypatrick variabilityincommerciallyavailabledeformableimageregistrationamultiinstitutionanalysisusingvirtualheadandneckphantoms
AT langenkatjam variabilityincommerciallyavailabledeformableimageregistrationamultiinstitutionanalysisusingvirtualheadandneckphantoms
AT bovafrankj variabilityincommerciallyavailabledeformableimageregistrationamultiinstitutionanalysisusingvirtualheadandneckphantoms
AT manonrafaelr variabilityincommerciallyavailabledeformableimageregistrationamultiinstitutionanalysisusingvirtualheadandneckphantoms
AT meekssanfordl variabilityincommerciallyavailabledeformableimageregistrationamultiinstitutionanalysisusingvirtualheadandneckphantoms