
Ultrasound image analysis using deep neural networks for discriminating between benign and malignant ovarian tumors: comparison with expert subjective assessment


Bibliographic Details
Main Authors: Christiansen, F., Epstein, E. L., Smedberg, E., Åkerlund, M., Smith, K., Epstein, E.
Format: Online Article Text
Language: English
Published: John Wiley & Sons, Ltd. 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7839489/
https://www.ncbi.nlm.nih.gov/pubmed/33142359
http://dx.doi.org/10.1002/uog.23530
collection PubMed
description OBJECTIVES: To develop and test the performance of computerized ultrasound image analysis using deep neural networks (DNNs) in discriminating between benign and malignant ovarian tumors and to compare its diagnostic accuracy with that of subjective assessment (SA) by an ultrasound expert. METHODS: We included 3077 (grayscale, n = 1927; power Doppler, n = 1150) ultrasound images from 758 women with ovarian tumors, who were classified prospectively by expert ultrasound examiners according to IOTA (International Ovarian Tumor Analysis) terms and definitions. Histological outcome from surgery (n = 634) or long‐term (≥ 3 years) follow‐up (n = 124) served as the gold standard. The dataset was split into a training set (n = 508; 314 benign and 194 malignant), a validation set (n = 100; 60 benign and 40 malignant) and a test set (n = 150; 75 benign and 75 malignant). We used transfer learning on three pre‐trained DNNs: VGG16, ResNet50 and MobileNet. Each model was trained, and its outputs were then calibrated using temperature scaling. An ensemble of the three models was then used to estimate the probability of malignancy based on all images from a given case. The DNN ensemble classified the tumors as benign or malignant (Ovry‐Dx1 model); or as benign, inconclusive or malignant (Ovry‐Dx2 model). The diagnostic performance of the DNN models, in terms of sensitivity and specificity, was compared to that of SA for classifying ovarian tumors in the test set. RESULTS: At a sensitivity of 96.0%, Ovry‐Dx1 had a specificity similar to that of SA (86.7% vs 88.0%; P = 1.0). Ovry‐Dx2 had a sensitivity of 97.1% and a specificity of 93.7%, when designating 12.7% of the lesions as inconclusive. By complementing Ovry‐Dx2 with SA in inconclusive cases, the overall sensitivity (96.0%) and specificity (89.3%) were not significantly different from using SA in all cases (P = 1.0).
CONCLUSION: Ultrasound image analysis using DNNs can predict ovarian malignancy with a diagnostic accuracy comparable to that of human expert examiners, indicating that these models may have a role in the triage of women with an ovarian tumor. © 2020 The Authors. Ultrasound in Obstetrics & Gynecology published by John Wiley & Sons Ltd on behalf of International Society of Ultrasound in Obstetrics and Gynecology.
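The calibrate-then-ensemble step described in the abstract (per-model temperature scaling, averaging of malignancy probabilities over models and over all images of a case, and a two- or three-way triage call) can be sketched as follows. This is a minimal illustration and not the authors' published code: the temperature values, the decision thresholds, and the toy logits are assumptions made for the example.

```python
# Sketch (assumed implementation, not the Ovry-Dx source code) of
# temperature-scaled calibration and ensembling of per-image
# malignancy probabilities, followed by a three-way triage call
# of the Ovry-Dx2 type. All numeric values are illustrative.
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature scaling: divide logits by T before the softmax.
    T > 1 softens (de-overconfidences) the output distribution."""
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max(axis=-1, keepdims=True)       # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def case_malignancy_probability(logits_per_model, temperatures):
    """Average calibrated P(malignant) over the model ensemble and over
    all images (grayscale and power Doppler) belonging to one case.
    Each element of logits_per_model has shape (n_images, 2)."""
    probs = [softmax(l, t)[:, 1]                 # column 1 = malignant class
             for l, t in zip(logits_per_model, temperatures)]
    return float(np.mean(np.stack(probs)))       # mean over models and images

def triage(p, low=0.3, high=0.7):
    """Three-way classification; the thresholds are assumptions."""
    if p < low:
        return "benign"
    if p > high:
        return "malignant"
    return "inconclusive"

# Toy logits: 3 models (e.g. VGG16, ResNet50, MobileNet), 2 images, 2 classes
rng = np.random.default_rng(0)
logits = [rng.normal(size=(2, 2)) for _ in range(3)]
p = case_malignancy_probability(logits, temperatures=[1.5, 2.0, 1.2])
print(triage(p))
```

In this kind of pipeline, cases labeled "inconclusive" would be referred to an expert examiner, which matches the abstract's strategy of complementing Ovry-Dx2 with subjective assessment in inconclusive cases.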
id pubmed-7839489
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling Ultrasound Obstet Gynecol (Original Papers). John Wiley & Sons, Ltd. Published online 2021-01-02; issue date 2021-01. © 2020 The Authors. Ultrasound in Obstetrics & Gynecology published by John Wiley & Sons Ltd on behalf of International Society of Ultrasound in Obstetrics and Gynecology. This is an open access article under the terms of the http://creativecommons.org/licenses/by/4.0/ License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
title Ultrasound image analysis using deep neural networks for discriminating between benign and malignant ovarian tumors: comparison with expert subjective assessment
topic Original Papers