
Evaluation of a deformable image registration quality assurance tool for head and neck cancer patients

INTRODUCTION: A challenge in implementing deformable image registration (DIR) in radiation therapy planning is effectively communicating registration accuracy to the radiation oncologist. This study aimed to evaluate the MIM® quality assurance (QA) tool for rating DIR accuracy. METHODS: Retrospective DIR was performed on CT images for 35 head and neck cancer patients. The QA tool was used to rate DIR accuracy as good, fair or bad. Thirty registered patient images were assessed independently by three RTs, and a further five patients were assessed by five RTs. Ratings were evaluated by comparison of Hausdorff Distance (HD), Mean Distance to Agreement (MDA), Dice Similarity Coefficients (DSC) and Jacobian determinants for parotid and mandible subregions on the two CTs post-DIR. Inter-operator reliability was assessed using Krippendorff's alpha coefficient (KALPA). Rating time and volume measures for each rating were also calculated. RESULTS: Quantitative metrics calculated for most anatomical subregions reflected the expected trend by registration accuracy, with good obtaining the most ideal values on average (HD = 7.50 ± 3.18, MDA = 0.64 ± 0.47, DSC = 0.90 ± 0.07, Jacobian = 0.95 ± 0.06). Highest inter-operator reliability was observed for good ratings and within the parotids (KALPA 0.66–0.93), whilst ratings varied the most in regions of dental artefact. Overall, average rating time was 33 minutes and the least commonly applied rating by volume was fair. CONCLUSION: Results from qualitative and quantitative data, operator rating differences and rating time suggest highlighting only bad regions of DIR accuracy and implementing clinical guidelines and RT training for consistent and efficient use of the QA tool.
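
The evaluation metrics named in the abstract (DSC, HD, MDA) are standard contour-comparison measures. The sketch below is a minimal, illustrative Python example of how such metrics can be computed for a pair of binary structure masks, assuming NumPy/SciPy and synthetic sphere masks standing in for a contour before and after DIR; the helper names (`dice`, `surface_distances`, `hd_and_mda`) and the example data are hypothetical and are not the MIM® implementation used in the study.

```python
# Illustrative sketch only (not the study's MIM workflow): DSC, a discrete
# Hausdorff Distance approximation and Mean Distance to Agreement between a
# reference structure mask and the same structure propagated through a DIR,
# both given as 3D boolean arrays on the same voxel grid.
import numpy as np
from scipy import ndimage


def dice(a: np.ndarray, b: np.ndarray) -> float:
    """DSC = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())


def surface_distances(a: np.ndarray, b: np.ndarray, spacing) -> np.ndarray:
    """Distances (in mm) from the surface voxels of `a` to the surface of `b`."""
    surf_a = a & ~ndimage.binary_erosion(a)
    surf_b = b & ~ndimage.binary_erosion(b)
    # Euclidean distance map to the surface of b, respecting voxel spacing.
    dist_to_b = ndimage.distance_transform_edt(~surf_b, sampling=spacing)
    return dist_to_b[surf_a]


def hd_and_mda(a, b, spacing=(1.0, 1.0, 1.0)):
    d_ab = surface_distances(a, b, spacing)
    d_ba = surface_distances(b, a, spacing)
    hd = max(d_ab.max(), d_ba.max())  # symmetric (voxel-surface) Hausdorff distance
    mda = (d_ab.sum() + d_ba.sum()) / (len(d_ab) + len(d_ba))  # mean distance to agreement
    return hd, mda


if __name__ == "__main__":
    # Two slightly offset spheres stand in for a contour before/after DIR.
    zz, yy, xx = np.mgrid[:60, :60, :60]
    ref = (zz - 30) ** 2 + (yy - 30) ** 2 + (xx - 30) ** 2 < 15 ** 2
    deformed = (zz - 31) ** 2 + (yy - 30) ** 2 + (xx - 32) ** 2 < 15 ** 2
    hd, mda = hd_and_mda(ref, deformed, spacing=(2.0, 1.0, 1.0))
    print(f"DSC={dice(ref, deformed):.3f}  HD={hd:.1f} mm  MDA={mda:.2f} mm")
```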


Bibliographic Details
Main Authors: Mee, Molly; Stewart, Kate; Lathouras, Marika; Truong, Helen; Hargrave, Catriona
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc., 2020
Subjects: Original Articles
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7754017/
https://www.ncbi.nlm.nih.gov/pubmed/33615738
http://dx.doi.org/10.1002/jmrs.428
Journal: Journal of Medical Radiation Sciences (J Med Radiat Sci), Original Articles
Published online: 8 December 2020
© 2020 The Authors. Journal of Medical Radiation Sciences published by John Wiley & Sons Australia, Ltd on behalf of the Australian Society of Medical Imaging and Radiation Therapy and the New Zealand Institute of Medical Radiation Technology. This is an open access article under the terms of the Creative Commons BY-NC-ND 4.0 licence (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.