Development and validation of a checklist for use with automatically generated radiotherapy plans

Bibliographic Details
Main Authors: Nealon, Kelly A., Court, Laurence E., Douglas, Raphael J., Zhang, Lifei, Han, Eun Young
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc. 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9512344/
https://www.ncbi.nlm.nih.gov/pubmed/35775105
http://dx.doi.org/10.1002/acm2.13694
Description
Summary:

PURPOSE: To develop a checklist that improves the rate of error detection during the plan review of automatically generated radiotherapy plans.

METHODS: A custom checklist was developed using guidance from American Association of Physicists in Medicine Task Groups 275 and 315 and the results of a failure modes and effects analysis of the Radiation Planning Assistant (RPA), an automated contouring and treatment planning tool. The preliminary checklist contained 90 review items for each automatically generated plan. In the first study, eight physicists familiar with the RPA were recruited from our institution. Each physicist reviewed 10 artificial intelligence-generated treatment plans from the RPA for safety and plan quality, five of which contained errors. Physicists performed plan checks, recorded errors, and rated each plan's clinical acceptability. Following a 2-week break, the physicists reviewed 10 additional plans with a similar distribution of errors using our customized checklist. Participants then provided feedback on the usability of the checklist, and it was modified accordingly. In a second study, this process was repeated with 14 senior medical physics residents who were randomly assigned to checklist or no checklist for their reviews. Each resident reviewed 10 plans, five of which contained errors, and completed the corresponding survey.

RESULTS: In the first study, the checklist significantly improved error detection, from 3.4 ± 1.1 errors per participant without the checklist to 4.4 ± 0.74 with it (p = 0.02), a 20% increase in error detection. In the second study, 2.9 ± 0.84 and 3.5 ± 0.84 errors per participant were detected without and with the revised checklist, respectively (p = 0.08). Although this difference was not statistically significant, error detection still increased by 18% when the checklist was used.

CONCLUSION: Our results indicate that the use of a customized checklist when reviewing automated treatment plans will improve patient safety.