
Medical physics 3.0 versus 1.0: A case study in digital radiography quality control

Bibliographic Details
Main Authors: Carver, Diana E., Willis, Charles E., Stauduhar, Paul J., Nishino, Thomas K., Wells, Jered R., Samei, Ehsan
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc. 2018
Subjects: Medical Imaging
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6123149/
https://www.ncbi.nlm.nih.gov/pubmed/30117273
http://dx.doi.org/10.1002/acm2.12425
author Carver, Diana E.
Willis, Charles E.
Stauduhar, Paul J.
Nishino, Thomas K.
Wells, Jered R.
Samei, Ehsan
collection PubMed
description PURPOSE: The study illustrates how a renewed approach to medical physics, Medical Physics 3.0 (MP3.0), can identify performance decrement of digital radiography (DR) systems when conventional Medical Physics 1.0 (MP1.0) methods fail. METHODS: MP1.0 tests included traditional annual tests plus the manufacturer's automated Quality Assurance Procedures (QAP) of a DR system before and after a radiologist's image quality (IQ) complaint, repeated after service intervention. Further analysis was conducted using nontraditional MP3.0 tests, including longitudinal review of QAP results from a 15‐yr database, exposure‐dependent signal‐to‐noise ratio squared (SNR²), clinical IQ, and correlation with the institutional service database. Clinical images were analyzed in terms of IQ metrics by the Duke University Clinical Imaging Physics Group using previously validated software. RESULTS: Traditional metrics did not indicate discrepant system performance at any time. QAP reported a decrease in contrast‐to‐noise ratio (CNR) after detector replacement, but CNR remained above the manufacturer's action limit. Clinical images showed increased lung noise (Ln), mediastinum noise (Mn), and subdiaphragm‐lung contrast (SLc), and decreased lung gray level (Lgl) following detector replacement. After detector recalibration, QAP CNR improved but did not return to previous levels. Lgl and SLc no longer significantly differed from before detector recalibration; however, Ln and Mn remained significantly different. Exposure‐dependent SNR² documented the detector operating within acceptable limits 9 yr previously but subsequently becoming miscalibrated sometime before the four prior annual tests. Service records revealed a catastrophic failure of the computer containing the original detector calibration from 11 yr prior. It is likely that an incorrect calibration backup file was uploaded at that time. CONCLUSIONS: MP1.0 tests failed to detect substandard system performance, but MP3.0 methods determined the root cause of the problem. MP3.0 exploits the wealth of available data with more sensitive performance indicators. Data analytics are powerful tools whose proper application could facilitate early intervention in degraded system performance.
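Two of the MP3.0 indicators named in the abstract lend themselves to simple automation: for a quantum-noise-limited detector, SNR² rises approximately linearly with detector exposure, so the SNR²-per-exposure slope is a calibration-sensitive index; and longitudinal QAP CNR values can be trended for step changes rather than only compared against a single action limit. The Python sketch below illustrates both ideas. It is a minimal illustration only, assuming flat-field ROI statistics and a CNR history in the layouts shown; the function names, data structures, and tolerance values are hypothetical and do not represent the authors' validated software or the manufacturer's QAP.

```python
"""Illustrative sketch (not the authors' software): two MP3.0-style checks
for a digital radiography detector.

Assumed inputs (not from the paper): flat-field measurements as
(exposure, mean signal, signal standard deviation) in a central ROI,
and a longitudinal QAP history of (date, CNR) pairs."""

from dataclasses import dataclass
from statistics import mean
from typing import List, Sequence, Tuple


@dataclass
class FlatField:
    exposure_ugy: float   # detector air kerma for the flat-field image
    mean_signal: float    # mean pixel value in a central ROI
    std_signal: float     # pixel standard deviation in the same ROI


def snr2_per_exposure(fields: Sequence[FlatField]) -> float:
    """Least-squares slope of SNR^2 versus exposure, forced through the origin.

    For a quantum-noise-limited detector, SNR^2 is proportional to exposure,
    so this slope should stay roughly constant over the detector's life; a
    step change suggests added noise or a miscalibration."""
    num = sum(f.exposure_ugy * (f.mean_signal / f.std_signal) ** 2 for f in fields)
    den = sum(f.exposure_ugy ** 2 for f in fields)
    return num / den


def snr2_within_baseline(fields: Sequence[FlatField],
                         baseline_slope: float,
                         tolerance: float = 0.10) -> bool:
    """Compare today's SNR^2-per-exposure slope with a stored baseline slope.

    The 10% fractional tolerance is a placeholder, not a published limit."""
    slope = snr2_per_exposure(fields)
    return abs(slope - baseline_slope) / baseline_slope <= tolerance


def cnr_step_change(cnr_history: List[Tuple[str, float]],
                    window: int = 10,
                    drop_fraction: float = 0.05) -> bool:
    """Trend check on longitudinal QAP CNR values.

    Returns True if the mean of the most recent `window` CNR values has
    dropped by more than `drop_fraction` relative to the preceding window,
    even when every individual value is still above the vendor action limit."""
    values = [cnr for _, cnr in cnr_history]
    if len(values) < 2 * window:
        return False
    recent = mean(values[-window:])
    prior = mean(values[-2 * window:-window])
    return (prior - recent) / prior > drop_fraction
```

A trend check along the lines of `cnr_step_change` shows how the post-replacement CNR decrease in this case might have been flagged even though every individual QAP value stayed above the manufacturer's action limit; the window size and drop threshold above are placeholders, not values from the study.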
format Online
Article
Text
id pubmed-6123149
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher John Wiley and Sons Inc.
record_format MEDLINE/PubMed
spelling pubmed-6123149 2018-09-10 Medical physics 3.0 versus 1.0: A case study in digital radiography quality control. Carver, Diana E.; Willis, Charles E.; Stauduhar, Paul J.; Nishino, Thomas K.; Wells, Jered R.; Samei, Ehsan. J Appl Clin Med Phys, Medical Imaging. John Wiley and Sons Inc. 2018-08-17 /pmc/articles/PMC6123149/ /pubmed/30117273 http://dx.doi.org/10.1002/acm2.12425 Text en © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine. This is an open access article under the terms of the http://creativecommons.org/licenses/by/4.0/ License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
title Medical physics 3.0 versus 1.0: A case study in digital radiography quality control
topic Medical Imaging
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6123149/
https://www.ncbi.nlm.nih.gov/pubmed/30117273
http://dx.doi.org/10.1002/acm2.12425