
An IoMT-Based Melanoma Lesion Segmentation Using Conditional Generative Adversarial Networks

Bibliographic Details
Main Authors: Ali, Zeeshan; Naz, Sheneela; Zaffar, Hira; Choi, Jaeun; Kim, Yongsung
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10098854/
https://www.ncbi.nlm.nih.gov/pubmed/37050607
http://dx.doi.org/10.3390/s23073548
Description
Summary: Internet of Medical Things (IoMT)-based technologies currently provide a foundation for remote data collection and medical assistance for various diseases. Along with developments in computer vision, the application of artificial intelligence and deep learning in IoMT devices aids the design of effective computer-aided diagnosis (CAD) systems for diseases such as melanoma, even in the absence of experts. However, for a CAD system to produce an effective diagnosis, it must accurately segment melanoma skin lesions from images. The visual similarity between normal skin and melanoma lesions is very high, which limits the accuracy of traditional, parametric, and deep-learning-based methods. As a solution to this segmentation challenge, we propose an advanced generative deep learning model, the conditional generative adversarial network (cGAN), for lesion segmentation. In the proposed technique, the generation of segmented images is conditioned on dermoscopic images of skin lesions, yielding accurate segmentation masks. We assessed the proposed model on three distinct datasets, DermQuest, DermIS, and ISIC2016, and attained segmentation accuracies of 99%, 97%, and 95%, respectively.
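The core idea described in the abstract, a generator that produces a segmentation mask and a discriminator that judges that mask *conditioned on* the input dermoscopic image, can be illustrated with a minimal PyTorch sketch. The architecture below (tiny two-layer networks, a pix2pix-style L1 term, all channel counts and loss weights) is an illustrative assumption, not the paper's actual configuration:

```python
# Minimal conditional-GAN segmentation sketch (assumed architecture,
# not the paper's exact model). Requires PyTorch.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a dermoscopic RGB image to a 1-channel lesion mask in [0, 1]."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Scores (image, mask) pairs: the mask is judged conditioned on the image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 16, 3, padding=1), nn.ReLU(),  # 3 image + 1 mask channels
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1),
        )
    def forward(self, image, mask):
        # Concatenating the image with the mask is what makes the GAN conditional.
        return self.net(torch.cat([image, mask], dim=1))

G, D = Generator(), Discriminator()
bce = nn.BCEWithLogitsLoss()
image = torch.randn(2, 3, 64, 64)      # batch of dermoscopic images (dummy data)
real_mask = torch.rand(2, 1, 64, 64)   # ground-truth segmentation masks (dummy data)
fake_mask = G(image)

# Discriminator loss: push real (image, mask) pairs toward 1, generated pairs toward 0.
d_loss = bce(D(image, real_mask), torch.ones(2, 1)) + \
         bce(D(image, fake_mask.detach()), torch.zeros(2, 1))

# Generator loss: fool the discriminator, plus an L1 term (a common pix2pix-style
# addition, assumed here) pulling generated masks toward the ground truth.
g_loss = bce(D(image, fake_mask), torch.ones(2, 1)) + \
         nn.functional.l1_loss(fake_mask, real_mask)
```

In a full training loop these two losses would be minimized alternately with separate optimizers for G and D; the conditioning ensures the discriminator rejects masks that are plausible in isolation but inconsistent with the input lesion image.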