Deep-learning segmentation of ultrasound images for automated calculation of the hydronephrosis area to renal parenchyma ratio

Bibliographic Details
Main Authors: Song, Sang Hoon, Han, Jae Hyeon, Kim, Kun Suk, Cho, Young Ah, Youn, Hye Jung, Kim, Young In, Kweon, Jihoon
Format: Online Article Text
Language: English
Published: The Korean Urological Association 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9262488/
https://www.ncbi.nlm.nih.gov/pubmed/35670007
http://dx.doi.org/10.4111/icu.20220085
_version_ 1784742506748968960
author Song, Sang Hoon
Han, Jae Hyeon
Kim, Kun Suk
Cho, Young Ah
Youn, Hye Jung
Kim, Young In
Kweon, Jihoon
author_facet Song, Sang Hoon
Han, Jae Hyeon
Kim, Kun Suk
Cho, Young Ah
Youn, Hye Jung
Kim, Young In
Kweon, Jihoon
author_sort Song, Sang Hoon
collection PubMed
description PURPOSE: We investigated the feasibility of measuring the hydronephrosis area to renal parenchyma (HARP) ratio from ultrasound images using a deep-learning network. MATERIALS AND METHODS: The coronal renal ultrasound images of 195 pediatric and adolescent patients who underwent pyeloplasty to repair ureteropelvic junction obstruction were retrospectively reviewed. After excluding cases without a representative longitudinal renal image, we used a dataset of 168 images for deep-learning segmentation. Ten networks, including combinations of DeepLabV3+ and UNet++, were assessed for their ability to delineate the hydronephrosis and kidney areas, and an ensemble method was applied for further improvement. Four-fold cross-validation was conducted by dividing the image set into four subsets, and the segmentation performance of the deep-learning networks was evaluated against the manually traced areas using sensitivity, specificity, and Dice similarity coefficients. RESULTS: All 10 networks and the ensemble methods showed good visual correlation with the manually traced kidney and hydronephrosis areas. The Dice similarity coefficient of the 10-model ensemble was 0.9108 on average, and the best 5-model ensemble had an average Dice similarity coefficient of 0.9113. We included patients with severe hydronephrosis who underwent renal ultrasonography at a single institution; thus, external validation of our algorithm in a heterogeneous ultrasonography examination setup with a diverse set of instruments is recommended. CONCLUSIONS: Deep-learning-based calculation of the HARP ratio is feasible and showed high accuracy in assessing the severity of hydronephrosis on ultrasonography. This algorithm can help physicians make more accurate and reproducible diagnoses of hydronephrosis using ultrasonography.
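The description above outlines the pipeline: segment the kidney and the dilated collecting system, combine the per-model outputs into an ensemble, score the masks against manual tracings with the Dice similarity coefficient, and derive the HARP ratio from the segmented areas. The Python sketch below is a minimal illustration of how those quantities could be computed from binary masks; it is not the authors' code, and the exact parenchyma definition (here, whole-kidney area minus hydronephrosis area), the probability-map thresholding, and the array shapes are assumptions.

import numpy as np

def harp_ratio(hydro_mask: np.ndarray, kidney_mask: np.ndarray) -> float:
    """Hydronephrosis area to renal parenchyma ratio from binary masks.
    Assumes parenchyma = whole-kidney area minus hydronephrosis area."""
    hydro_area = np.count_nonzero(hydro_mask)
    parenchyma_area = np.count_nonzero(kidney_mask) - hydro_area
    if parenchyma_area <= 0:
        raise ValueError("kidney mask must exceed the hydronephrosis mask")
    return hydro_area / parenchyma_area

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between a predicted and a manually traced mask."""
    intersection = np.count_nonzero(pred & truth)
    denom = np.count_nonzero(pred) + np.count_nonzero(truth)
    return 2.0 * intersection / denom if denom else 1.0

def ensemble_mask(prob_maps, threshold=0.5):
    """Average per-model probability maps and threshold to a single binary mask."""
    return np.mean(prob_maps, axis=0) >= threshold

# Toy example on an 8x8 grid: a 4x4 kidney containing a 2x2 dilated region.
kidney = np.zeros((8, 8), dtype=bool); kidney[2:6, 2:6] = True
hydro = np.zeros((8, 8), dtype=bool); hydro[3:5, 3:5] = True
print(round(harp_ratio(hydro, kidney), 3))        # 4 / (16 - 4) = 0.333
print(round(dice_coefficient(hydro, kidney), 3))  # 2*4 / (4 + 16) = 0.4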
format Online
Article
Text
id pubmed-9262488
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher The Korean Urological Association
record_format MEDLINE/PubMed
spelling pubmed-9262488 2022-07-13 Deep-learning segmentation of ultrasound images for automated calculation of the hydronephrosis area to renal parenchyma ratio. Song, Sang Hoon; Han, Jae Hyeon; Kim, Kun Suk; Cho, Young Ah; Youn, Hye Jung; Kim, Young In; Kweon, Jihoon. Investig Clin Urol, Original Article. The Korean Urological Association 2022-07 2022-05-25 /pmc/articles/PMC9262488/ /pubmed/35670007 http://dx.doi.org/10.4111/icu.20220085 Text en © The Korean Urological Association. This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (https://creativecommons.org/licenses/by-nc/4.0/), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Original Article
Song, Sang Hoon
Han, Jae Hyeon
Kim, Kun Suk
Cho, Young Ah
Youn, Hye Jung
Kim, Young In
Kweon, Jihoon
Deep-learning segmentation of ultrasound images for automated calculation of the hydronephrosis area to renal parenchyma ratio
title Deep-learning segmentation of ultrasound images for automated calculation of the hydronephrosis area to renal parenchyma ratio
title_full Deep-learning segmentation of ultrasound images for automated calculation of the hydronephrosis area to renal parenchyma ratio
title_fullStr Deep-learning segmentation of ultrasound images for automated calculation of the hydronephrosis area to renal parenchyma ratio
title_full_unstemmed Deep-learning segmentation of ultrasound images for automated calculation of the hydronephrosis area to renal parenchyma ratio
title_short Deep-learning segmentation of ultrasound images for automated calculation of the hydronephrosis area to renal parenchyma ratio
title_sort deep-learning segmentation of ultrasound images for automated calculation of the hydronephrosis area to renal parenchyma ratio
topic Original Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9262488/
https://www.ncbi.nlm.nih.gov/pubmed/35670007
http://dx.doi.org/10.4111/icu.20220085
work_keys_str_mv AT songsanghoon deeplearningsegmentationofultrasoundimagesforautomatedcalculationofthehydronephrosisareatorenalparenchymaratio
AT hanjaehyeon deeplearningsegmentationofultrasoundimagesforautomatedcalculationofthehydronephrosisareatorenalparenchymaratio
AT kimkunsuk deeplearningsegmentationofultrasoundimagesforautomatedcalculationofthehydronephrosisareatorenalparenchymaratio
AT choyoungah deeplearningsegmentationofultrasoundimagesforautomatedcalculationofthehydronephrosisareatorenalparenchymaratio
AT younhyejung deeplearningsegmentationofultrasoundimagesforautomatedcalculationofthehydronephrosisareatorenalparenchymaratio
AT kimyoungin deeplearningsegmentationofultrasoundimagesforautomatedcalculationofthehydronephrosisareatorenalparenchymaratio
AT kweonjihoon deeplearningsegmentationofultrasoundimagesforautomatedcalculationofthehydronephrosisareatorenalparenchymaratio