
A generative adversarial network for synthetization of regions of interest based on digital mammograms


Bibliographic Details
Main Authors: Oyelade, Olaide N., Ezugwu, Absalom E., Almutairi, Mubarak S., Saha, Apu Kumar, Abualigah, Laith, Chiroma, Haruna
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9008034/
https://www.ncbi.nlm.nih.gov/pubmed/35418566
http://dx.doi.org/10.1038/s41598-022-09929-9
_version_ 1784686963268255744
author Oyelade, Olaide N.
Ezugwu, Absalom E.
Almutairi, Mubarak S.
Saha, Apu Kumar
Abualigah, Laith
Chiroma, Haruna
author_facet Oyelade, Olaide N.
Ezugwu, Absalom E.
Almutairi, Mubarak S.
Saha, Apu Kumar
Abualigah, Laith
Chiroma, Haruna
author_sort Oyelade, Olaide N.
collection PubMed
description Deep learning (DL) models are becoming pervasive in computer vision, image processing, and image synthesis. Their performance is often improved through architectural configuration and tweaks, the use of large training datasets, and careful selection of hyperparameters. Applied to medical image processing, deep learning models have shown strong performance, correctly detecting abnormalities in medical digital images and in some cases surpassing human physicians. Advancing research in this domain, however, relies heavily on the availability of training datasets, which are sometimes not publicly accessible, insufficient for training, or affected by class imbalance among samples. As a result, inadequate training samples and difficulty in accessing new datasets limit performance and hinder research into new domains. Generative adversarial networks (GANs) have therefore been proposed to bridge this gap by synthesizing data similar to real sample images. We observed, however, that benchmark datasets with regions of interest (ROIs) for characterizing breast cancer abnormalities in digital mammography do not contain sufficient data with a fair distribution across all classes of abnormality. For instance, architectural distortion and breast asymmetry are sparsely represented in most publicly available digital mammogram datasets. This paper proposes a GAN model, named ROImammoGAN, that synthesizes ROI-based digital mammograms. Our approach designs a GAN consisting of a generator and a discriminator that learn a hierarchy of representations for abnormalities in digital mammograms. Attention is given to architectural distortion, asymmetry, mass, and microcalcification abnormalities, so that training distinctively learns the features of each abnormality and generates sufficient images for each category. The proposed GAN model was applied to the MIAS dataset, and the performance evaluation yielded competitive accuracy for the synthesized samples. The quality of the generated images was also evaluated using PSNR, SSIM, FSIM, BRISQUE, PIQE, NIQE, FID, and the geometry score. The results show that ROImammoGAN performs competitively with state-of-the-art GANs. The outcome of this study is a model for augmenting CNN models with ROI-centric image samples for the characterization of abnormalities in breast images.
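As a concrete illustration of the generator–discriminator setup and PSNR-style evaluation described above: this record does not reproduce the actual ROImammoGAN layers, so the sketch below is a minimal, hypothetical DCGAN-style pair in PyTorch for 1-channel 64×64 ROI patches. The latent size, channel widths, patch resolution, and the psnr helper are illustrative assumptions, not values or code from the paper.

```python
# Minimal, hypothetical DCGAN-style sketch for 64x64 grayscale ROI patches.
# NOT the ROImammoGAN architecture from the paper; latent size, channel
# widths, and patch resolution are illustrative assumptions.
import torch
import torch.nn as nn

LATENT_DIM = 100  # assumed latent vector size


class Generator(nn.Module):
    """Maps a latent vector (N, LATENT_DIM, 1, 1) to a (N, 1, 64, 64) image in [-1, 1]."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(LATENT_DIM, 256, 4, 1, 0, bias=False),  # -> 256 x 4 x 4
            nn.BatchNorm2d(256), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(256, 128, 4, 2, 1, bias=False),         # -> 128 x 8 x 8
            nn.BatchNorm2d(128), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False),          # -> 64 x 16 x 16
            nn.BatchNorm2d(64), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 32, 4, 2, 1, bias=False),           # -> 32 x 32 x 32
            nn.BatchNorm2d(32), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 1, 4, 2, 1),                        # -> 1 x 64 x 64
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)


class Discriminator(nn.Module):
    """Scores a (N, 1, 64, 64) patch as real (close to 1) or synthetic (close to 0)."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True),    # -> 32 x 32 x 32
            nn.Conv2d(32, 64, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True),   # -> 64 x 16 x 16
            nn.Conv2d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True),  # -> 128 x 8 x 8
            nn.Conv2d(128, 256, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True), # -> 256 x 4 x 4
            nn.Conv2d(256, 1, 4, 1, 0),                                    # -> 1 x 1 x 1
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x).view(-1)


def psnr(reference, test, data_range=2.0):
    """Peak signal-to-noise ratio in dB; data_range=2.0 for images scaled to [-1, 1]."""
    mse = torch.mean((reference - test) ** 2)
    return 10.0 * torch.log10(data_range ** 2 / mse)


if __name__ == "__main__":
    g, d = Generator(), Discriminator()
    z = torch.randn(4, LATENT_DIM, 1, 1)
    fake = g(z)                 # (4, 1, 64, 64) synthetic ROI patches
    scores = d(fake)            # 4 real/fake probabilities
    noisy = (fake + 0.05 * torch.randn_like(fake)).clamp(-1, 1)
    print(fake.shape, scores.shape, psnr(fake, noisy).item())
```

In a setup like the one the abstract describes, one such pair could be trained per abnormality class (architectural distortion, asymmetry, mass, microcalcification), or a single class-conditional variant could be used; the per-class choice here is only part of the sketch, not a claim about the paper's design.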
format Online
Article
Text
id pubmed-9008034
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-9008034 2022-04-15 A generative adversarial network for synthetization of regions of interest based on digital mammograms Oyelade, Olaide N. Ezugwu, Absalom E. Almutairi, Mubarak S. Saha, Apu Kumar Abualigah, Laith Chiroma, Haruna Sci Rep Article Nature Publishing Group UK 2022-04-13 /pmc/articles/PMC9008034/ /pubmed/35418566 http://dx.doi.org/10.1038/s41598-022-09929-9 Text en © The Author(s) 2022
Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.
spellingShingle Article
Oyelade, Olaide N.
Ezugwu, Absalom E.
Almutairi, Mubarak S.
Saha, Apu Kumar
Abualigah, Laith
Chiroma, Haruna
A generative adversarial network for synthetization of regions of interest based on digital mammograms
title A generative adversarial network for synthetization of regions of interest based on digital mammograms
title_full A generative adversarial network for synthetization of regions of interest based on digital mammograms
title_fullStr A generative adversarial network for synthetization of regions of interest based on digital mammograms
title_full_unstemmed A generative adversarial network for synthetization of regions of interest based on digital mammograms
title_short A generative adversarial network for synthetization of regions of interest based on digital mammograms
title_sort generative adversarial network for synthetization of regions of interest based on digital mammograms
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9008034/
https://www.ncbi.nlm.nih.gov/pubmed/35418566
http://dx.doi.org/10.1038/s41598-022-09929-9
work_keys_str_mv AT oyeladeolaiden agenerativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms
AT ezugwuabsalome agenerativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms
AT almutairimubaraks agenerativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms
AT sahaapukumar agenerativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms
AT abualigahlaith agenerativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms
AT chiromaharuna agenerativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms
AT oyeladeolaiden generativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms
AT ezugwuabsalome generativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms
AT almutairimubaraks generativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms
AT sahaapukumar generativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms
AT abualigahlaith generativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms
AT chiromaharuna generativeadversarialnetworkforsynthetizationofregionsofinterestbasedondigitalmammograms