
BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network

The aim of this study was to investigate the potential of a machine learning algorithm to classify breast cancer solely by the presence of soft tissue opacities in mammograms, independent of other morphological features, using a deep convolutional neural network (dCNN). Soft tissue opacities were classified based on their radiological appearance using the ACR BI-RADS atlas. We included 1744 mammograms from 438 patients to create 7242 icons by manual labeling. The icons were sorted into three categories: “no opacities” (BI-RADS 1), “probably benign opacities” (BI-RADS 2/3) and “suspicious opacities” (BI-RADS 4/5). A dCNN was trained (70% of data), validated (20%) and finally tested (10%). A sliding window approach was applied to create colored probability maps for visual impression. Diagnostic performance of the dCNN was compared to human readout by experienced radiologists on a “real-world” dataset. The accuracies of the models on the test dataset ranged between 73.8% and 89.8%. Compared to human readout, our dCNN achieved a higher specificity (100%, 95% CI: 85.4–100%; reader 1: 86.2%, 95% CI: 67.4–95.5%; reader 2: 79.3%, 95% CI: 59.7–91.3%), but a lower sensitivity (84.0%, 95% CI: 63.9–95.5%) than the human readers (reader 1: 88.0%, 95% CI: 67.4–95.4%; reader 2: 88.0%, 95% CI: 67.7–96.8%). In conclusion, a dCNN can be used for the automatic detection as well as the standardized and observer-independent classification of soft tissue opacities in mammograms, independent of the presence of microcalcifications. Human decision making in accordance with the BI-RADS classification can be mimicked by artificial intelligence.
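As a rough illustration of the sliding-window approach described in the abstract (not the authors' published code), the following Python sketch applies a patch classifier across an image and averages the per-class probabilities into a pixel-wise probability map. Here predict_patch is a hypothetical stand-in for the trained three-class dCNN, and the window and stride sizes are illustrative assumptions.

# Minimal sketch of a sliding-window probability map, assuming a trained
# 3-class patch classifier ("no opacities", "probably benign", "suspicious").
import numpy as np

N_CLASSES = 3

def predict_patch(patch: np.ndarray) -> np.ndarray:
    """Hypothetical placeholder for the trained dCNN; returns softmax probabilities."""
    logits = np.array([patch.mean(), patch.std(), patch.max()])  # dummy features
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

def probability_map(image: np.ndarray, window: int = 64, stride: int = 32) -> np.ndarray:
    """Slide a window over the image and average class probabilities per pixel."""
    h, w = image.shape
    probs = np.zeros((h, w, N_CLASSES))
    counts = np.zeros((h, w, 1))
    for y in range(0, h - window + 1, stride):
        for x in range(0, w - window + 1, stride):
            p = predict_patch(image[y:y + window, x:x + window])
            probs[y:y + window, x:x + window] += p
            counts[y:y + window, x:x + window] += 1
    return probs / np.maximum(counts, 1)  # per-pixel class probabilities

if __name__ == "__main__":
    mammogram = np.random.rand(256, 256)  # stand-in for a preprocessed mammogram
    pmap = probability_map(mammogram)
    print(pmap.shape)  # (256, 256, 3); each pixel's probabilities can be color-coded

The averaged per-pixel probabilities can then be rendered as a color overlay on the mammogram, which is the "colored probability map for visual impression" the abstract refers to.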


Bibliographic Details
Main Authors: Sabani, Albin, Landsmann, Anna, Hejduk, Patryk, Schmidt, Cynthia, Marcon, Magda, Borkowski, Karol, Rossi, Cristina, Ciritsis, Alexander, Boss, Andreas
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9318280/
https://www.ncbi.nlm.nih.gov/pubmed/35885470
http://dx.doi.org/10.3390/diagnostics12071564
_version_ 1784755252139917312
author Sabani, Albin
Landsmann, Anna
Hejduk, Patryk
Schmidt, Cynthia
Marcon, Magda
Borkowski, Karol
Rossi, Cristina
Ciritsis, Alexander
Boss, Andreas
author_facet Sabani, Albin
Landsmann, Anna
Hejduk, Patryk
Schmidt, Cynthia
Marcon, Magda
Borkowski, Karol
Rossi, Cristina
Ciritsis, Alexander
Boss, Andreas
author_sort Sabani, Albin
collection PubMed
description The aim of this study was to investigate the potential of a machine learning algorithm to classify breast cancer solely by the presence of soft tissue opacities in mammograms, independent of other morphological features, using a deep convolutional neural network (dCNN). Soft tissue opacities were classified based on their radiological appearance using the ACR BI-RADS atlas. We included 1744 mammograms from 438 patients to create 7242 icons by manual labeling. The icons were sorted into three categories: “no opacities” (BI-RADS 1), “probably benign opacities” (BI-RADS 2/3) and “suspicious opacities” (BI-RADS 4/5). A dCNN was trained (70% of data), validated (20%) and finally tested (10%). A sliding window approach was applied to create colored probability maps for visual impression. Diagnostic performance of the dCNN was compared to human readout by experienced radiologists on a “real-world” dataset. The accuracies of the models on the test dataset ranged between 73.8% and 89.8%. Compared to human readout, our dCNN achieved a higher specificity (100%, 95% CI: 85.4–100%; reader 1: 86.2%, 95% CI: 67.4–95.5%; reader 2: 79.3%, 95% CI: 59.7–91.3%), but a lower sensitivity (84.0%, 95% CI: 63.9–95.5%) than the human readers (reader 1: 88.0%, 95% CI: 67.4–95.4%; reader 2: 88.0%, 95% CI: 67.7–96.8%). In conclusion, a dCNN can be used for the automatic detection as well as the standardized and observer-independent classification of soft tissue opacities in mammograms, independent of the presence of microcalcifications. Human decision making in accordance with the BI-RADS classification can be mimicked by artificial intelligence.
format Online
Article
Text
id pubmed-9318280
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9318280 2022-07-27 BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network Sabani, Albin Landsmann, Anna Hejduk, Patryk Schmidt, Cynthia Marcon, Magda Borkowski, Karol Rossi, Cristina Ciritsis, Alexander Boss, Andreas Diagnostics (Basel) Article The aim of this study was to investigate the potential of a machine learning algorithm to classify breast cancer solely by the presence of soft tissue opacities in mammograms, independent of other morphological features, using a deep convolutional neural network (dCNN). Soft tissue opacities were classified based on their radiological appearance using the ACR BI-RADS atlas. We included 1744 mammograms from 438 patients to create 7242 icons by manual labeling. The icons were sorted into three categories: “no opacities” (BI-RADS 1), “probably benign opacities” (BI-RADS 2/3) and “suspicious opacities” (BI-RADS 4/5). A dCNN was trained (70% of data), validated (20%) and finally tested (10%). A sliding window approach was applied to create colored probability maps for visual impression. Diagnostic performance of the dCNN was compared to human readout by experienced radiologists on a “real-world” dataset. The accuracies of the models on the test dataset ranged between 73.8% and 89.8%. Compared to human readout, our dCNN achieved a higher specificity (100%, 95% CI: 85.4–100%; reader 1: 86.2%, 95% CI: 67.4–95.5%; reader 2: 79.3%, 95% CI: 59.7–91.3%), but a lower sensitivity (84.0%, 95% CI: 63.9–95.5%) than the human readers (reader 1: 88.0%, 95% CI: 67.4–95.4%; reader 2: 88.0%, 95% CI: 67.7–96.8%). In conclusion, a dCNN can be used for the automatic detection as well as the standardized and observer-independent classification of soft tissue opacities in mammograms, independent of the presence of microcalcifications. Human decision making in accordance with the BI-RADS classification can be mimicked by artificial intelligence. MDPI 2022-06-28 /pmc/articles/PMC9318280/ /pubmed/35885470 http://dx.doi.org/10.3390/diagnostics12071564 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Sabani, Albin
Landsmann, Anna
Hejduk, Patryk
Schmidt, Cynthia
Marcon, Magda
Borkowski, Karol
Rossi, Cristina
Ciritsis, Alexander
Boss, Andreas
BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network
title BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network
title_full BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network
title_fullStr BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network
title_full_unstemmed BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network
title_short BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network
title_sort bi-rads-based classification of mammographic soft tissue opacities using a deep convolutional neural network
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9318280/
https://www.ncbi.nlm.nih.gov/pubmed/35885470
http://dx.doi.org/10.3390/diagnostics12071564
work_keys_str_mv AT sabanialbin biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork
AT landsmannanna biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork
AT hejdukpatryk biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork
AT schmidtcynthia biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork
AT marconmagda biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork
AT borkowskikarol biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork
AT rossicristina biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork
AT ciritsisalexander biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork
AT bossandreas biradsbasedclassificationofmammographicsofttissueopacitiesusingadeepconvolutionalneuralnetwork