A deep learning-based self-adapting ensemble method for segmentation in gynecological brachytherapy
| Main Authors | Li, Zhen; Zhu, Qingyuan; Zhang, Lihua; Yang, Xiaojing; Li, Zhaobin; Fu, Jie |
|---|---|
| Format | Online Article Text |
| Language | English |
| Published | BioMed Central, 2022 |
| Online Access | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9446699/ https://www.ncbi.nlm.nih.gov/pubmed/36064571 http://dx.doi.org/10.1186/s13014-022-02121-3 |
_version_ | 1784783700447199232 |
---|---|
author | Li, Zhen; Zhu, Qingyuan; Zhang, Lihua; Yang, Xiaojing; Li, Zhaobin; Fu, Jie |
author_facet | Li, Zhen; Zhu, Qingyuan; Zhang, Lihua; Yang, Xiaojing; Li, Zhaobin; Fu, Jie |
author_sort | Li, Zhen |
collection | PubMed |
description | PURPOSE: Fast and accurate outlining of the organs at risk (OARs) and high-risk clinical tumor volume (HRCTV) is especially important in high-dose-rate brachytherapy due to the highly time-intensive online treatment planning process and the high dose gradient around the HRCTV. This study aims to apply a self-configured ensemble method for fast and reproducible auto-segmentation of OARs and HRCTVs in gynecological cancer. MATERIALS AND METHODS: We applied nnU-Net (no new U-Net), an automatically adapting deep convolutional neural network based on U-Net, to segment the bladder, rectum and HRCTV on CT images in gynecological cancer. In nnU-Net, three architectures, 2D U-Net, 3D U-Net and 3D-Cascade U-Net, were trained and finally ensembled. A total of 207 cases were randomly chosen for training and 30 for testing. Quantitative evaluation used well-established image segmentation metrics, including the Dice similarity coefficient (DSC), 95% Hausdorff distance (HD95%), and average surface distance (ASD). Qualitative analysis of the automated segmentation results was performed visually by two radiation oncologists. Dosimetric evaluation was performed by comparing the dose-volume parameters of the predicted segmentations and the human contours. RESULTS: nnU-Net obtained high qualitative and quantitative segmentation accuracy on the test dataset and performed better than previously reported methods in bladder and rectum segmentation. In the quantitative evaluation, 3D-Cascade achieved the best performance in the bladder (DSC: 0.936 ± 0.051, HD95%: 3.503 ± 1.956, ASD: 0.944 ± 0.503), rectum (DSC: 0.831 ± 0.074, HD95%: 7.579 ± 5.857, ASD: 3.6 ± 3.485), and HRCTV (DSC: 0.836 ± 0.07, HD95%: 7.42 ± 5.023, ASD: 2.094 ± 1.311). According to the qualitative evaluation, over 76% of the test dataset had no or only minor visually detectable segmentation errors.
CONCLUSION: This work showed nnU-Net's superiority in segmenting OARs and HRCTV in gynecological brachytherapy cases in our center, among which 3D-Cascade showed the highest segmentation accuracy across different applicators and patient anatomies. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s13014-022-02121-3. |
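The abstract's headline metric is the Dice similarity coefficient (DSC), defined as 2|A∩B| / (|A| + |B|) for predicted and reference masks A and B. As an illustration only (not the authors' code), a minimal pure-Python sketch of DSC on masks represented as sets of voxel coordinates:

```python
def dice_similarity(pred: set, gt: set) -> float:
    """Dice similarity coefficient between two voxel sets: 2|A∩B| / (|A| + |B|)."""
    if not pred and not gt:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * len(pred & gt) / (len(pred) + len(gt))

# Toy example: two overlapping 4x4 "masks" given as sets of (row, col) voxels
a = {(r, c) for r in range(2, 6) for c in range(2, 6)}  # 16 voxels
b = {(r, c) for r in range(3, 7) for c in range(3, 7)}  # 16 voxels, 9 shared
print(dice_similarity(a, b))  # 2*9 / (16+16) = 0.5625
```

A DSC of 1.0 means perfect overlap and 0.0 means none, so the reported bladder value of 0.936 indicates near-complete agreement with the human contours; HD95% and ASD complement it by measuring boundary distances rather than volume overlap.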
format | Online Article Text |
id | pubmed-9446699 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-9446699 2022-09-07 A deep learning-based self-adapting ensemble method for segmentation in gynecological brachytherapy. Li, Zhen; Zhu, Qingyuan; Zhang, Lihua; Yang, Xiaojing; Li, Zhaobin; Fu, Jie. Radiat Oncol, Research. BioMed Central 2022-09-05 /pmc/articles/PMC9446699/ /pubmed/36064571 http://dx.doi.org/10.1186/s13014-022-02121-3 Text en © The Author(s) 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as appropriate credit is given, a link to the licence is provided, and any changes are indicated. The Creative Commons Public Domain Dedication waiver (https://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data. |
spellingShingle | Research Li, Zhen Zhu, Qingyuan Zhang, Lihua Yang, Xiaojing Li, Zhaobin Fu, Jie A deep learning-based self-adapting ensemble method for segmentation in gynecological brachytherapy |
title | A deep learning-based self-adapting ensemble method for segmentation in gynecological brachytherapy |
title_full | A deep learning-based self-adapting ensemble method for segmentation in gynecological brachytherapy |
title_fullStr | A deep learning-based self-adapting ensemble method for segmentation in gynecological brachytherapy |
title_full_unstemmed | A deep learning-based self-adapting ensemble method for segmentation in gynecological brachytherapy |
title_short | A deep learning-based self-adapting ensemble method for segmentation in gynecological brachytherapy |
title_sort | deep learning-based self-adapting ensemble method for segmentation in gynecological brachytherapy |
topic | Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9446699/ https://www.ncbi.nlm.nih.gov/pubmed/36064571 http://dx.doi.org/10.1186/s13014-022-02121-3 |
work_keys_str_mv | AT lizhen adeeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy AT zhuqingyuan adeeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy AT zhanglihua adeeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy AT yangxiaojing adeeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy AT lizhaobin adeeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy AT fujie adeeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy AT lizhen deeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy AT zhuqingyuan deeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy AT zhanglihua deeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy AT yangxiaojing deeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy AT lizhaobin deeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy AT fujie deeplearningbasedselfadaptingensemblemethodforsegmentationingynecologicalbrachytherapy |