
An evaluation of quality assurance guidelines comparing the American College of Radiology and American Association of Physicists in Medicine task group 284 for magnetic resonance simulation

PURPOSE: The purpose of this study was to evaluate similarities and differences in quality assurance (QA) guidelines for a conventional diagnostic magnetic resonance (MR) system and a MR simulator (MR‐SIM) system used for radiotherapy. METHODS: In this study, we compared QA testing guidelines from the American College of Radiology (ACR) MR Quality Control (MR QC) Manual to the QA section of the American Association of Physicists in Medicine (AAPM) Task Group 284 report (TG‐284). Differences and similarities were identified in testing scope, frequency, and tolerances. QA testing results from an ACR accredited clinical diagnostic MR system following ACR MR QC instructions were then evaluated using TG‐284 tolerances. RESULTS: Five tests from the ACR MR QC Manual were not included in TG‐284. Five new tests were identified for MR‐SIM systems in TG‐284 and pertained exclusively to the external laser positioning system of MR‐SIM systems. “Low‐contrast object detectability” (LCD), “table motion smoothness and accuracy,” “transmitter gain,” and “geometric accuracy” tests differed between the two QA guides. Tighter tolerances were required in TG‐284 for “table motion smoothness and accuracy” and “low contrast object detectability.” “Transmitter gain” tolerance was dependent on initial baseline measurements, and TG‐284 required that geometric accuracy be tested over a larger field of view than the ACR testing method. All tests from the ACR MR QC Manual for a conventional MR system passed ACR tolerances. The T2‐weighted image acquired with ACR sequences failed the 40‐spoke requirement from TG‐284, transmitter gain was at the 5% tolerance of TG‐284, and geometric accuracy could not be evaluated because of required equipment differences. Table motion passed both TG‐284 and ACR required tolerances. CONCLUSION: Our study evaluated QA guidelines for an MR‐SIM and demonstrated the additional QA requirements of a clinical diagnostic MR system to be used as an MR‐SIM in radiotherapy as recommended by TG‐284.

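The abstract describes checking a diagnostic MR system's routine QA results against both ACR MR QC and TG‐284 action levels. As a rough, hypothetical illustration of that kind of comparison (not code from the study), the Python sketch below evaluates a measured low‐contrast spoke count and transmitter‐gain drift against the two figures quoted in the abstract (TG‐284's 40‐spoke LCD requirement and 5% transmitter‐gain tolerance); the ACR spoke threshold and the example measurement values are placeholders, not values from the article.

```python
# Hypothetical sketch of an ACR vs. TG-284 tolerance check.
# Only the 40-spoke LCD requirement and the 5% transmitter-gain tolerance
# are quoted in the abstract; every other number here is a placeholder.
from dataclasses import dataclass


@dataclass
class QAMeasurement:
    lcd_spokes_t2: int                 # total low-contrast spokes resolved on the T2 series
    transmitter_gain: float            # current transmitter gain value
    transmitter_gain_baseline: float   # value recorded at baseline/acceptance


def evaluate(m: QAMeasurement) -> dict:
    """Return illustrative pass/fail results for each guideline."""
    ACR_LCD_MIN_SPOKES = 37    # placeholder ACR criterion, not taken from the abstract
    TG284_LCD_MIN_SPOKES = 40  # TG-284 requirement quoted in the abstract

    # TG-284 ties transmitter gain to an initial baseline measurement;
    # the 5% tolerance is quoted in the abstract.
    gain_drift = abs(m.transmitter_gain - m.transmitter_gain_baseline) / m.transmitter_gain_baseline

    return {
        "lcd_acr_pass": m.lcd_spokes_t2 >= ACR_LCD_MIN_SPOKES,
        "lcd_tg284_pass": m.lcd_spokes_t2 >= TG284_LCD_MIN_SPOKES,
        "gain_tg284_pass": gain_drift <= 0.05,
        "gain_drift_percent": round(100 * gain_drift, 2),
    }


if __name__ == "__main__":
    # Made-up example in the spirit of the abstract: a spoke count that would
    # pass the placeholder ACR criterion but fail TG-284's 40-spoke requirement,
    # with transmitter-gain drift sitting exactly at the 5% limit.
    print(evaluate(QAMeasurement(lcd_spokes_t2=39,
                                 transmitter_gain=105.0,
                                 transmitter_gain_baseline=100.0)))
```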

Bibliographic Details
Main Authors: Buatti, Jacob S., Gallagher, Kyle J., Bailey, Isaac, Griglock, Thomas, Heard, Malcolm
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc. 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9359023/
https://www.ncbi.nlm.nih.gov/pubmed/35851720
http://dx.doi.org/10.1002/acm2.13730
_version_ 1784764051363987456
author Buatti, Jacob S.
Gallagher, Kyle J.
Bailey, Isaac
Griglock, Thomas
Heard, Malcolm
author_facet Buatti, Jacob S.
Gallagher, Kyle J.
Bailey, Isaac
Griglock, Thomas
Heard, Malcolm
author_sort Buatti, Jacob S.
collection PubMed
description PURPOSE: The purpose of this study was to evaluate similarities and differences in quality assurance (QA) guidelines for a conventional diagnostic magnetic resonance (MR) system and a MR simulator (MR‐SIM) system used for radiotherapy. METHODS: In this study, we compared QA testing guidelines from the American College of Radiology (ACR) MR Quality Control (MR QC) Manual to the QA section of the American Association of Physicists in Medicine (AAPM) Task Group 284 report (TG‐284). Differences and similarities were identified in testing scope, frequency, and tolerances. QA testing results from an ACR accredited clinical diagnostic MR system following ACR MR QC instructions were then evaluated using TG‐284 tolerances. RESULTS: Five tests from the ACR MR QC Manual were not included in TG‐284. Five new tests were identified for MR‐SIM systems in TG‐284 and pertained exclusively to the external laser positioning system of MR‐SIM systems. “Low‐contrast object detectability” (LCD), “table motion smoothness and accuracy,” “transmitter gain,” and “geometric accuracy” tests differed between the two QA guides. Tighter tolerances were required in TG‐284 for “table motion smoothness and accuracy” and “low contrast object detectability.” “Transmitter gain” tolerance was dependent on initial baseline measurements, and TG‐284 required that geometric accuracy be tested over a larger field of view than the ACR testing method. All tests from the ACR MR QC Manual for a conventional MR system passed ACR tolerances. The T2‐weighted image acquired with ACR sequences failed the 40‐spoke requirement from TG‐284, transmitter gain was at the 5% tolerance of TG‐284, and geometric accuracy could not be evaluated because of required equipment differences. Table motion passed both TG‐284 and ACR required tolerances. CONCLUSION: Our study evaluated QA guidelines for an MR‐SIM and demonstrated the additional QA requirements of a clinical diagnostic MR system to be used as an MR‐SIM in radiotherapy as recommended by TG‐284.
format Online
Article
Text
id pubmed-9359023
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher John Wiley and Sons Inc.
record_format MEDLINE/PubMed
spelling pubmed-93590232022-08-10 An evaluation of quality assurance guidelines comparing the American College of Radiology and American Association of Physicists in Medicine task group 284 for magnetic resonance simulation Buatti, Jacob S. Gallagher, Kyle J. Bailey, Isaac Griglock, Thomas Heard, Malcolm J Appl Clin Med Phys Radiation Oncology Physics PURPOSE: The purpose of this study was to evaluate similarities and differences in quality assurance (QA) guidelines for a conventional diagnostic magnetic resonance (MR) system and a MR simulator (MR‐SIM) system used for radiotherapy. METHODS: In this study, we compared QA testing guidelines from the American College of Radiology (ACR) MR Quality Control (MR QC) Manual to the QA section of the American Association of Physicists in Medicine (AAPM) Task Group 284 report (TG‐284). Differences and similarities were identified in testing scope, frequency, and tolerances. QA testing results from an ACR accredited clinical diagnostic MR system following ACR MR QC instructions were then evaluated using TG‐284 tolerances. RESULTS: Five tests from the ACR MR QC Manual were not included in TG‐284. Five new tests were identified for MR‐SIM systems in TG‐284 and pertained exclusively to the external laser positioning system of MR‐SIM systems. “Low‐contrast object detectability” (LCD), “table motion smoothness and accuracy,” “transmitter gain,” and “geometric accuracy” tests differed between the two QA guides. Tighter tolerances were required in TG‐284 for “table motion smoothness and accuracy” and “low contrast object detectability.” “Transmitter gain” tolerance was dependent on initial baseline measurements, and TG‐284 required that geometric accuracy be tested over a larger field of view than the ACR testing method. All tests from the ACR MR QC Manual for a conventional MR system passed ACR tolerances. The T2‐weighted image acquired with ACR sequences failed the 40‐spoke requirement from TG‐284, transmitter gain was at the 5% tolerance of TG‐284, and geometric accuracy could not be evaluated because of required equipment differences. Table motion passed both TG‐284 and ACR required tolerances. CONCLUSION: Our study evaluated QA guidelines for an MR‐SIM and demonstrated the additional QA requirements of a clinical diagnostic MR system to be used as an MR‐SIM in radiotherapy as recommended by TG‐284. John Wiley and Sons Inc. 2022-07-18 /pmc/articles/PMC9359023/ /pubmed/35851720 http://dx.doi.org/10.1002/acm2.13730 Text en © 2022 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, LLC on behalf of The American Association of Physicists in Medicine. https://creativecommons.org/licenses/by/4.0/This is an open access article under the terms of the http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
spellingShingle Radiation Oncology Physics
Buatti, Jacob S.
Gallagher, Kyle J.
Bailey, Isaac
Griglock, Thomas
Heard, Malcolm
An evaluation of quality assurance guidelines comparing the American College of Radiology and American Association of Physicists in Medicine task group 284 for magnetic resonance simulation
title An evaluation of quality assurance guidelines comparing the American College of Radiology and American Association of Physicists in Medicine task group 284 for magnetic resonance simulation
title_full An evaluation of quality assurance guidelines comparing the American College of Radiology and American Association of Physicists in Medicine task group 284 for magnetic resonance simulation
title_fullStr An evaluation of quality assurance guidelines comparing the American College of Radiology and American Association of Physicists in Medicine task group 284 for magnetic resonance simulation
title_full_unstemmed An evaluation of quality assurance guidelines comparing the American College of Radiology and American Association of Physicists in Medicine task group 284 for magnetic resonance simulation
title_short An evaluation of quality assurance guidelines comparing the American College of Radiology and American Association of Physicists in Medicine task group 284 for magnetic resonance simulation
title_sort evaluation of quality assurance guidelines comparing the american college of radiology and american association of physicists in medicine task group 284 for magnetic resonance simulation
topic Radiation Oncology Physics
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9359023/
https://www.ncbi.nlm.nih.gov/pubmed/35851720
http://dx.doi.org/10.1002/acm2.13730
work_keys_str_mv AT buattijacobs anevaluationofqualityassuranceguidelinescomparingtheamericancollegeofradiologyandamericanassociationofphysicistsinmedicinetaskgroup284formagneticresonancesimulation
AT gallagherkylej anevaluationofqualityassuranceguidelinescomparingtheamericancollegeofradiologyandamericanassociationofphysicistsinmedicinetaskgroup284formagneticresonancesimulation
AT baileyisaac anevaluationofqualityassuranceguidelinescomparingtheamericancollegeofradiologyandamericanassociationofphysicistsinmedicinetaskgroup284formagneticresonancesimulation
AT griglockthomas anevaluationofqualityassuranceguidelinescomparingtheamericancollegeofradiologyandamericanassociationofphysicistsinmedicinetaskgroup284formagneticresonancesimulation
AT heardmalcolm anevaluationofqualityassuranceguidelinescomparingtheamericancollegeofradiologyandamericanassociationofphysicistsinmedicinetaskgroup284formagneticresonancesimulation
AT buattijacobs evaluationofqualityassuranceguidelinescomparingtheamericancollegeofradiologyandamericanassociationofphysicistsinmedicinetaskgroup284formagneticresonancesimulation
AT gallagherkylej evaluationofqualityassuranceguidelinescomparingtheamericancollegeofradiologyandamericanassociationofphysicistsinmedicinetaskgroup284formagneticresonancesimulation
AT baileyisaac evaluationofqualityassuranceguidelinescomparingtheamericancollegeofradiologyandamericanassociationofphysicistsinmedicinetaskgroup284formagneticresonancesimulation
AT griglockthomas evaluationofqualityassuranceguidelinescomparingtheamericancollegeofradiologyandamericanassociationofphysicistsinmedicinetaskgroup284formagneticresonancesimulation
AT heardmalcolm evaluationofqualityassuranceguidelinescomparingtheamericancollegeofradiologyandamericanassociationofphysicistsinmedicinetaskgroup284formagneticresonancesimulation