Deep learning-based segmentation of the thorax in mouse micro-CT scans
For image-guided small animal irradiations, the whole workflow of imaging, organ contouring, irradiation planning, and delivery is typically performed in a single session requiring continuous administration of anaesthetic agents. Automating contouring leads to a faster workflow, which limits exposure to anaesthesia and thereby reduces its impact on experimental results and on animal wellbeing.
Main Authors: | Malimban, Justin; Lathouwers, Danny; Qian, Haibin; Verhaegen, Frank; Wiedemann, Julia; Brandenburg, Sytze; Staring, Marius |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2022 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8810936/ https://www.ncbi.nlm.nih.gov/pubmed/35110676 http://dx.doi.org/10.1038/s41598-022-05868-7 |
_version_ | 1784644334780416000 |
---|---|
author | Malimban, Justin; Lathouwers, Danny; Qian, Haibin; Verhaegen, Frank; Wiedemann, Julia; Brandenburg, Sytze; Staring, Marius
author_facet | Malimban, Justin; Lathouwers, Danny; Qian, Haibin; Verhaegen, Frank; Wiedemann, Julia; Brandenburg, Sytze; Staring, Marius
author_sort | Malimban, Justin |
collection | PubMed |
description | For image-guided small animal irradiations, the whole workflow of imaging, organ contouring, irradiation planning, and delivery is typically performed in a single session requiring continuous administration of anaesthetic agents. Automating contouring leads to a faster workflow, which limits exposure to anaesthesia and thereby reduces its impact on experimental results and on animal wellbeing. Here, we trained the 2D and 3D U-Net architectures of no-new-Net (nnU-Net) for autocontouring of the thorax in mouse micro-CT images. We trained the models only on native CTs and evaluated their performance using an independent testing dataset (i.e., native CTs not included in the training and validation). Unlike previous studies, we also tested the model performance on an external dataset (i.e., contrast-enhanced CTs) to see how well the models predict on CTs completely different from what they were trained on. We also assessed the interobserver variability using the generalized conformity index ($CI_{gen}$) among three observers, providing a stronger human baseline for evaluating automated contours than previous studies. Lastly, we showed the benefit in contouring time compared to manual contouring. The results show that the 3D models of nnU-Net achieve superior segmentation accuracy and are more robust to unseen data than the 2D models. For all target organs, the mean surface distance (MSD) and the Hausdorff distance (95p HD) of the best performing model for this task (nnU-Net 3d_fullres) are within 0.16 mm and 0.60 mm, respectively. These values are below the minimum required contouring accuracy of 1 mm for small animal irradiations and improve significantly upon the state-of-the-art 2D U-Net-based AIMOS method. Moreover, the conformity indices of the 3d_fullres model also compare favourably to the interobserver variability for all target organs, whereas the 2D models perform poorly in this regard. Importantly, the 3d_fullres model offers a 98% reduction in contouring time.
format | Online Article Text |
id | pubmed-8810936 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-8810936 2022-02-07 Deep learning-based segmentation of the thorax in mouse micro-CT scans Malimban, Justin; Lathouwers, Danny; Qian, Haibin; Verhaegen, Frank; Wiedemann, Julia; Brandenburg, Sytze; Staring, Marius Sci Rep Article For image-guided small animal irradiations, the whole workflow of imaging, organ contouring, irradiation planning, and delivery is typically performed in a single session requiring continuous administration of anaesthetic agents. Automating contouring leads to a faster workflow, which limits exposure to anaesthesia and thereby reduces its impact on experimental results and on animal wellbeing. Here, we trained the 2D and 3D U-Net architectures of no-new-Net (nnU-Net) for autocontouring of the thorax in mouse micro-CT images. We trained the models only on native CTs and evaluated their performance using an independent testing dataset (i.e., native CTs not included in the training and validation). Unlike previous studies, we also tested the model performance on an external dataset (i.e., contrast-enhanced CTs) to see how well the models predict on CTs completely different from what they were trained on. We also assessed the interobserver variability using the generalized conformity index ($CI_{gen}$) among three observers, providing a stronger human baseline for evaluating automated contours than previous studies. Lastly, we showed the benefit in contouring time compared to manual contouring. The results show that the 3D models of nnU-Net achieve superior segmentation accuracy and are more robust to unseen data than the 2D models. For all target organs, the mean surface distance (MSD) and the Hausdorff distance (95p HD) of the best performing model for this task (nnU-Net 3d_fullres) are within 0.16 mm and 0.60 mm, respectively. These values are below the minimum required contouring accuracy of 1 mm for small animal irradiations and improve significantly upon the state-of-the-art 2D U-Net-based AIMOS method. Moreover, the conformity indices of the 3d_fullres model also compare favourably to the interobserver variability for all target organs, whereas the 2D models perform poorly in this regard. Importantly, the 3d_fullres model offers a 98% reduction in contouring time. Nature Publishing Group UK 2022-02-02 /pmc/articles/PMC8810936/ /pubmed/35110676 http://dx.doi.org/10.1038/s41598-022-05868-7 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
spellingShingle | Article; Malimban, Justin; Lathouwers, Danny; Qian, Haibin; Verhaegen, Frank; Wiedemann, Julia; Brandenburg, Sytze; Staring, Marius; Deep learning-based segmentation of the thorax in mouse micro-CT scans
title | Deep learning-based segmentation of the thorax in mouse micro-CT scans |
title_full | Deep learning-based segmentation of the thorax in mouse micro-CT scans |
title_fullStr | Deep learning-based segmentation of the thorax in mouse micro-CT scans |
title_full_unstemmed | Deep learning-based segmentation of the thorax in mouse micro-CT scans |
title_short | Deep learning-based segmentation of the thorax in mouse micro-CT scans |
title_sort | deep learning-based segmentation of the thorax in mouse micro-ct scans |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8810936/ https://www.ncbi.nlm.nih.gov/pubmed/35110676 http://dx.doi.org/10.1038/s41598-022-05868-7 |
work_keys_str_mv | AT malimbanjustin deeplearningbasedsegmentationofthethoraxinmousemicroctscans AT lathouwersdanny deeplearningbasedsegmentationofthethoraxinmousemicroctscans AT qianhaibin deeplearningbasedsegmentationofthethoraxinmousemicroctscans AT verhaegenfrank deeplearningbasedsegmentationofthethoraxinmousemicroctscans AT wiedemannjulia deeplearningbasedsegmentationofthethoraxinmousemicroctscans AT brandenburgsytze deeplearningbasedsegmentationofthethoraxinmousemicroctscans AT staringmarius deeplearningbasedsegmentationofthethoraxinmousemicroctscans |
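The abstract above reports segmentation accuracy as the mean surface distance (MSD), the 95th-percentile Hausdorff distance (95p HD), and interobserver agreement as the generalized conformity index ($CI_{gen}$) among three observers. The record itself contains no code, so the following is only a minimal sketch of how these metrics are commonly computed for binary segmentation masks, assuming NumPy/SciPy arrays and the usual pairwise definition of $CI_{gen}$ (sum of pairwise intersections over sum of pairwise unions); the function names and voxel-spacing convention are illustrative and not taken from the paper.

```python
import numpy as np
from scipy import ndimage


def _surface_distances(a, b, spacing):
    """Distances (in mm) from the surface voxels of mask `a` to the
    nearest surface voxel of mask `b`. Masks must be non-empty."""
    a, b = a.astype(bool), b.astype(bool)
    # Surface voxels: foreground voxels with at least one background neighbour.
    surf_a = a ^ ndimage.binary_erosion(a)
    surf_b = b ^ ndimage.binary_erosion(b)
    # Euclidean distance from every voxel to the nearest surface voxel of `b`,
    # accounting for (possibly anisotropic) voxel spacing.
    dist_to_b = ndimage.distance_transform_edt(~surf_b, sampling=spacing)
    return dist_to_b[surf_a]


def msd_and_hd95(pred, ref, spacing=(1.0, 1.0, 1.0)):
    """Symmetric mean surface distance (MSD) and 95th-percentile
    Hausdorff distance (95p HD) between two binary masks."""
    d = np.concatenate([_surface_distances(pred, ref, spacing),
                        _surface_distances(ref, pred, spacing)])
    return d.mean(), np.percentile(d, 95)


def ci_gen(masks):
    """Generalized conformity index over a list of observer masks:
    sum of pairwise intersections divided by sum of pairwise unions."""
    inter = union = 0
    for i in range(len(masks)):
        for j in range(i + 1, len(masks)):
            inter += np.logical_and(masks[i], masks[j]).sum()
            union += np.logical_or(masks[i], masks[j]).sum()
    return inter / union
```

With three observer contours, `ci_gen([m1, m2, m3])` approaches 1 for perfect agreement, giving the kind of human baseline the abstract describes; `msd_and_hd95` yields values comparable to the 0.16 mm MSD and 0.60 mm 95p HD quoted for the 3d_fullres model when reference contours are available.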