
Generative adversarial networks based skin lesion segmentation

Skin cancer is a serious condition that requires accurate diagnosis and treatment. One way to assist clinicians in this task is using computer-aided diagnosis tools that automatically segment skin lesions from dermoscopic images. We propose a novel adversarial learning-based framework called Efficient-GAN (EGAN) that uses an unsupervised generative network to generate accurate lesion masks. It consists of a generator module with a top-down squeeze excitation-based compound scaled path, an asymmetric lateral connection-based bottom-up path, and a discriminator module that distinguishes between original and synthetic masks. A morphology-based smoothing loss is also implemented to encourage the network to create smooth semantic boundaries of lesions. The framework is evaluated on the International Skin Imaging Collaboration Lesion Dataset. It outperforms the current state-of-the-art skin lesion segmentation approaches with a Dice coefficient, Jaccard similarity, and accuracy of 90.1%, 83.6%, and 94.5%, respectively. We also design a lightweight segmentation framework called Mobile-GAN (MGAN) that achieves comparable performance as EGAN but with an order of magnitude lower number of training parameters, thus resulting in faster inference times for low compute resource settings.
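The abstract reports segmentation quality as Dice coefficient, Jaccard similarity (IoU), and pixel accuracy (90.1%, 83.6%, and 94.5%). The sketch below is a generic illustration of how these metrics are typically computed for binary lesion masks; it is not the authors' evaluation code, and the `pred_mask`/`gt_mask` inputs are hypothetical.

```python
# Generic Dice, Jaccard (IoU), and pixel-accuracy computation for binary masks.
# Illustrates the metrics named in the abstract; not the paper's implementation.
import numpy as np

def segmentation_metrics(pred_mask: np.ndarray, gt_mask: np.ndarray, eps: float = 1e-7):
    """Compute (dice, jaccard, accuracy) for two binary masks of equal shape."""
    pred = pred_mask.astype(bool)
    gt = gt_mask.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    dice = (2.0 * intersection + eps) / (pred.sum() + gt.sum() + eps)
    jaccard = (intersection + eps) / (union + eps)
    accuracy = (pred == gt).mean()
    return dice, jaccard, accuracy

# Example with small hypothetical masks:
pred = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]])
gt   = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 0]])
print(segmentation_metrics(pred, gt))  # dice ~ 0.857, jaccard = 0.75, accuracy ~ 0.889
```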


Bibliographic Details
Main Authors: Innani, Shubham, Dutande, Prasad, Baid, Ujjwal, Pokuri, Venu, Bakas, Spyridon, Talbar, Sanjay, Baheti, Bhakti, Guntuku, Sharath Chandra
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10439152/
https://www.ncbi.nlm.nih.gov/pubmed/37596306
http://dx.doi.org/10.1038/s41598-023-39648-8
Collection: PubMed
Record ID: pubmed-10439152
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Sci Rep
Published Online: 2023-08-18
Rights: © The Author(s) 2023. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, provided appropriate credit is given to the original author(s) and the source, a link to the licence is provided, and any changes are indicated. Third-party material is included under the article's licence unless indicated otherwise; uses not covered by the licence or by statutory regulation require permission directly from the copyright holder.