
Explainability agreement between dermatologists and five visual explanations techniques in deep neural networks for melanoma AI classification

INTRODUCTION: The use of deep convolutional neural networks for analyzing skin lesion images has shown promising results. The identification of skin cancer by faster and less expensive means can lead to an early diagnosis, saving lives and avoiding treatment costs. However, to implement this technology in a clinical context, it is important for specialists to understand why a certain model makes a prediction; it must be explainable. Explainability techniques can be used to highlight the patterns of interest for a prediction.

METHODS: Our goal was to test five different techniques: Grad-CAM, Grad-CAM++, Score-CAM, Eigen-CAM, and LIME, analyzing the agreement rate between the features highlighted by the visual explanation maps and three important clinical criteria for melanoma classification: asymmetry, border irregularity, and color heterogeneity (the ABC rule), in 100 melanoma images. Two dermatologists scored the visual maps and the clinical images using a semi-quantitative scale, and the results were compared. They also ranked their preferred techniques.

RESULTS: The techniques had different agreement rates and acceptance. In the overall analysis, Grad-CAM showed the best total+partial agreement rate (93.6%), followed by LIME (89.8%), Grad-CAM++ (88.0%), Eigen-CAM (86.4%), and Score-CAM (84.6%). The dermatologists ranked Grad-CAM and Grad-CAM++ as their favorite options, followed by Score-CAM, LIME, and Eigen-CAM.

DISCUSSION: Saliency maps are one of the few methods that can be used for visual explanations. Evaluating explainability with humans is ideal for assessing how understandable and applicable these methods are. Our results demonstrate a significant agreement between the clinical features dermatologists use to diagnose melanomas and the visual explanation techniques, especially Grad-CAM.
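For readers unfamiliar with the saliency-map techniques the abstract names, below is a minimal sketch of Grad-CAM, the method the dermatologists rated highest. It assumes a torchvision ResNet-50 with standard ImageNet preprocessing as a stand-in classifier; this record does not describe the authors' actual network or pipeline, so the model, target layer, and preprocessing here are illustrative assumptions, not the paper's implementation.

```python
# Minimal Grad-CAM sketch (Selvaraju et al.). The ResNet-50 backbone and
# ImageNet preprocessing below are illustrative stand-ins, not the
# classifier used in the paper.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.eval()

activations, gradients = {}, {}

def save_activation(module, inp, out):
    activations["value"] = out.detach()

def save_gradient(module, grad_in, grad_out):
    gradients["value"] = grad_out[0].detach()

# Hook the last convolutional block: Grad-CAM combines its activation
# maps with the gradients of the target class score w.r.t. those maps.
target_layer = model.layer4[-1]
target_layer.register_forward_hook(save_activation)
target_layer.register_full_backward_hook(save_gradient)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def grad_cam(image_path, class_idx=None):
    """Return a [224, 224] saliency map in [0, 1] for one image."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    logits = model(x)
    if class_idx is None:
        class_idx = int(logits.argmax(dim=1))
    model.zero_grad()
    logits[0, class_idx].backward()

    # Global-average-pool the gradients into per-channel weights, take a
    # weighted sum of the activation maps, and keep only positive evidence.
    weights = gradients["value"].mean(dim=(2, 3), keepdim=True)  # [1, C, 1, 1]
    cam = F.relu((weights * activations["value"]).sum(dim=1))    # [1, h, w]
    cam = F.interpolate(cam.unsqueeze(1), size=(224, 224),
                        mode="bilinear", align_corners=False)[0, 0]
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
```

The other CAM variants the study compares differ mainly in how the channel weights are obtained: Grad-CAM++ adds higher-order gradient terms, Score-CAM scores masked forward passes instead of using gradients, and Eigen-CAM takes the principal component of the activations. LIME is gradient-free altogether, perturbing superpixels and fitting a local surrogate model.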

Bibliographic Details
Main Authors: Giavina-Bianchi, Mara; Vitor, William Gois; Fornasiero de Paiva, Victor; Okita, Aline Lissa; Sousa, Raquel Machado; Machado, Birajara
Format: Online Article Text
Language: English
Journal: Front Med (Lausanne)
Published: Frontiers Media S.A., 2023-08-31
Subjects: Medicine
Collection: PubMed (record ID: pubmed-10513767; record format: MEDLINE/PubMed)
Institution: National Center for Biotechnology Information
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10513767/
https://www.ncbi.nlm.nih.gov/pubmed/37746081
http://dx.doi.org/10.3389/fmed.2023.1241484
License: Copyright © 2023 Giavina-Bianchi, Vitor, Fornasiero de Paiva, Okita, Sousa and Machado. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY): https://creativecommons.org/licenses/by/4.0/