Detection of COVID-19 in X-ray Images Using Densely Connected Squeeze Convolutional Neural Network (DCSCNN): Focusing on Interpretability and Explainability of the Black Box Model
The novel coronavirus (COVID-19), which emerged as a pandemic, has claimed many lives and affected millions of people across the world since December 2019. Although the disease is now largely under control, it still affects people in many countries. The traditional way of diagnosis is tim...
Main Authors: | Ali, Sikandar; Hussain, Ali; Bhattacharjee, Subrata; Athar, Ali; Abdullah; Kim, Hee-Cheol |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2022 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9781899/ https://www.ncbi.nlm.nih.gov/pubmed/36560352 http://dx.doi.org/10.3390/s22249983 |
_version_ | 1784857187534766080 |
---|---|
author | Ali, Sikandar; Hussain, Ali; Bhattacharjee, Subrata; Athar, Ali; Abdullah; Kim, Hee-Cheol
author_facet | Ali, Sikandar; Hussain, Ali; Bhattacharjee, Subrata; Athar, Ali; Abdullah; Kim, Hee-Cheol
author_sort | Ali, Sikandar |
collection | PubMed |
description | The novel coronavirus (COVID-19), which emerged as a pandemic, has claimed many lives and affected millions of people across the world since December 2019. Although the disease is now largely under control, it still affects people in many countries. The traditional way of diagnosis is time-consuming, less efficient, and has a low detection rate for this disease. Therefore, there is a need for an automatic system that expedites the diagnosis process while retaining performance and accuracy. Artificial intelligence (AI) technologies such as machine learning (ML) and deep learning (DL) potentially provide powerful solutions to this problem. In this study, a state-of-the-art CNN model, the densely connected squeeze convolutional neural network (DCSCNN), was developed for the classification of X-ray images of COVID-19, pneumonia, normal, and lung opacity patients. Data were collected from different sources. We applied different preprocessing techniques to enhance the quality of the images so that our model could learn accurately and give optimal performance. Moreover, the attention regions and decisions of the AI model were visualized using the Grad-CAM and LIME methods. The DCSCNN combines the strengths of the Dense and Squeeze networks. In our experiments, seven kinds of classification were performed, of which six are binary classifications (COVID vs. normal, COVID vs. lung opacity, lung opacity vs. normal, COVID vs. pneumonia, pneumonia vs. lung opacity, pneumonia vs. normal) and one is a multiclass classification (COVID vs. pneumonia vs. lung opacity vs. normal). The main contributions of this paper are as follows. First, the development of the DCSCNN model, which is capable of performing binary as well as multiclass classification with excellent accuracy. Second, to ensure trust, transparency, and explainability of the model, we applied two popular explainable AI (XAI) techniques, i.e., Grad-CAM and LIME. These techniques helped address the black-box nature of the model while improving its trust, transparency, and explainability. Our proposed DCSCNN model achieved an accuracy of 98.8% for the classification of COVID-19 vs. normal, followed by COVID-19 vs. lung opacity: 98.2%, lung opacity vs. normal: 97.2%, COVID-19 vs. pneumonia: 96.4%, pneumonia vs. lung opacity: 95.8%, pneumonia vs. normal: 97.4%, and, for the multiclass classification of all four classes (COVID vs. pneumonia vs. lung opacity vs. normal), 94.7%. The DCSCNN model provides excellent classification performance, consequently helping doctors diagnose diseases quickly and efficiently. |
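The abstract describes the architecture only in words (a combination of the Dense and Squeeze networks). The following is a minimal, illustrative PyTorch sketch of one way such a "densely connected squeeze" network could be assembled: DenseNet-style feature concatenation applied over SqueezeNet-style Fire modules. All layer counts, channel widths, the input size, and the classifier head are assumptions for illustration and are not the authors' published DCSCNN configuration.

```python
# Sketch of a dense block built from SqueezeNet Fire modules (assumed design,
# not the published DCSCNN). Requires PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FireModule(nn.Module):
    """SqueezeNet Fire module: a 1x1 squeeze conv followed by parallel
    1x1 and 3x3 expand convs whose outputs are concatenated."""

    def __init__(self, in_channels, squeeze_channels, expand_channels):
        super().__init__()
        self.squeeze = nn.Conv2d(in_channels, squeeze_channels, kernel_size=1)
        self.expand1x1 = nn.Conv2d(squeeze_channels, expand_channels, kernel_size=1)
        self.expand3x3 = nn.Conv2d(squeeze_channels, expand_channels, kernel_size=3, padding=1)

    def forward(self, x):
        s = F.relu(self.squeeze(x))
        return torch.cat([F.relu(self.expand1x1(s)), F.relu(self.expand3x3(s))], dim=1)


class DenseSqueezeBlock(nn.Module):
    """Dense block whose layers are Fire modules: each module receives the
    concatenation of the block input and all previous module outputs."""

    def __init__(self, in_channels, num_modules=3, squeeze_channels=16, expand_channels=32):
        super().__init__()
        growth = 2 * expand_channels  # channels added by each Fire module
        self.fire_layers = nn.ModuleList(
            [FireModule(in_channels + i * growth, squeeze_channels, expand_channels)
             for i in range(num_modules)]
        )
        self.out_channels = in_channels + num_modules * growth

    def forward(self, x):
        features = [x]
        for fire in self.fire_layers:
            features.append(fire(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)


class DCSCNNSketch(nn.Module):
    """Toy 4-class classifier (COVID / pneumonia / lung opacity / normal)
    built from a stem conv, two dense-squeeze blocks, and a linear head."""

    def __init__(self, num_classes=4):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU())
        self.block1 = DenseSqueezeBlock(32)
        self.block2 = DenseSqueezeBlock(self.block1.out_channels)
        self.head = nn.Linear(self.block2.out_channels, num_classes)

    def forward(self, x):
        x = self.stem(x)
        x = F.max_pool2d(self.block1(x), 2)
        x = self.block2(x)
        x = F.adaptive_avg_pool2d(x, 1).flatten(1)
        return self.head(x)


if __name__ == "__main__":
    model = DCSCNNSketch()
    logits = model(torch.randn(1, 1, 224, 224))  # grayscale chest-X-ray-sized input
    print(logits.shape)  # torch.Size([1, 4])
```

For the explainability step mentioned in the abstract, Grad-CAM heatmaps are typically obtained by weighting the last convolutional feature maps by their spatially pooled gradients for the predicted class, and the lime package's LimeImageExplainer provides the complementary perturbation-based explanations; the specific layers and settings used by the authors are not given in this record.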
format | Online Article Text |
id | pubmed-9781899 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-9781899 2022-12-24 Detection of COVID-19 in X-ray Images Using Densely Connected Squeeze Convolutional Neural Network (DCSCNN): Focusing on Interpretability and Explainability of the Black Box Model Ali, Sikandar; Hussain, Ali; Bhattacharjee, Subrata; Athar, Ali; Abdullah; Kim, Hee-Cheol Sensors (Basel) Article MDPI 2022-12-18 /pmc/articles/PMC9781899/ /pubmed/36560352 http://dx.doi.org/10.3390/s22249983 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle | Article Ali, Sikandar; Hussain, Ali; Bhattacharjee, Subrata; Athar, Ali; Abdullah; Kim, Hee-Cheol Detection of COVID-19 in X-ray Images Using Densely Connected Squeeze Convolutional Neural Network (DCSCNN): Focusing on Interpretability and Explainability of the Black Box Model
title | Detection of COVID-19 in X-ray Images Using Densely Connected Squeeze Convolutional Neural Network (DCSCNN): Focusing on Interpretability and Explainability of the Black Box Model |
title_full | Detection of COVID-19 in X-ray Images Using Densely Connected Squeeze Convolutional Neural Network (DCSCNN): Focusing on Interpretability and Explainability of the Black Box Model |
title_fullStr | Detection of COVID-19 in X-ray Images Using Densely Connected Squeeze Convolutional Neural Network (DCSCNN): Focusing on Interpretability and Explainability of the Black Box Model |
title_full_unstemmed | Detection of COVID-19 in X-ray Images Using Densely Connected Squeeze Convolutional Neural Network (DCSCNN): Focusing on Interpretability and Explainability of the Black Box Model |
title_short | Detection of COVID-19 in X-ray Images Using Densely Connected Squeeze Convolutional Neural Network (DCSCNN): Focusing on Interpretability and Explainability of the Black Box Model |
title_sort | detection of covid-19 in x-ray images using densely connected squeeze convolutional neural network (dcscnn): focusing on interpretability and explainability of the black box model |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9781899/ https://www.ncbi.nlm.nih.gov/pubmed/36560352 http://dx.doi.org/10.3390/s22249983 |
work_keys_str_mv | AT alisikandar detectionofcovid19inxrayimagesusingdenselyconnectedsqueezeconvolutionalneuralnetworkdcscnnfocusingoninterpretabilityandexplainabilityoftheblackboxmodel AT hussainali detectionofcovid19inxrayimagesusingdenselyconnectedsqueezeconvolutionalneuralnetworkdcscnnfocusingoninterpretabilityandexplainabilityoftheblackboxmodel AT bhattacharjeesubrata detectionofcovid19inxrayimagesusingdenselyconnectedsqueezeconvolutionalneuralnetworkdcscnnfocusingoninterpretabilityandexplainabilityoftheblackboxmodel AT atharali detectionofcovid19inxrayimagesusingdenselyconnectedsqueezeconvolutionalneuralnetworkdcscnnfocusingoninterpretabilityandexplainabilityoftheblackboxmodel AT abdullah detectionofcovid19inxrayimagesusingdenselyconnectedsqueezeconvolutionalneuralnetworkdcscnnfocusingoninterpretabilityandexplainabilityoftheblackboxmodel AT kimheecheol detectionofcovid19inxrayimagesusingdenselyconnectedsqueezeconvolutionalneuralnetworkdcscnnfocusingoninterpretabilityandexplainabilityoftheblackboxmodel |