
Explanation-Driven Deep Learning Model for Prediction of Brain Tumour Status Using MRI Image Data


Bibliographic Details
Main Authors: Gaur, Loveleen; Bhandari, Mohan; Razdan, Tanvi; Mallik, Saurav; Zhao, Zhongming
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2022
Subjects: Genetics
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8964286/
https://www.ncbi.nlm.nih.gov/pubmed/35360838
http://dx.doi.org/10.3389/fgene.2022.822666
_version_ 1784678180674600960
author Gaur, Loveleen
Bhandari, Mohan
Razdan, Tanvi
Mallik, Saurav
Zhao, Zhongming
author_facet Gaur, Loveleen
Bhandari, Mohan
Razdan, Tanvi
Mallik, Saurav
Zhao, Zhongming
author_sort Gaur, Loveleen
collection PubMed
description Cancer research has seen explosive growth in the use of deep learning (DL) techniques for analysing magnetic resonance imaging (MRI) images to predict brain tumours, yet we have observed a substantial gap between explanation and interpretability on the one hand and high accuracy on the other in existing DL models. Consequently, we propose an explanation-driven DL model that combines a convolutional neural network (CNN), local interpretable model-agnostic explanations (LIME), and Shapley additive explanations (SHAP) to predict discrete subtypes of brain tumours (meningioma, glioma, and pituitary) from an MRI image dataset. Unlike previous models, ours uses a dual-input CNN approach to overcome the challenge of classifying lower-quality images affected by noise and metal artifacts, which we simulate by adding Gaussian noise. Our trained CNN reaches 94.64% accuracy, which we compare against other state-of-the-art methods. We use SHAP to ensure consistency and local accuracy of the interpretation, since Shapley values evaluate predictions over all possible combinations of input features, whereas LIME fits sparse linear models around each prediction to illustrate how the model behaves in that prediction's local neighbourhood. Our emphasis in this study is on interpretability together with high accuracy, which is critical for recognising disparities in predictive performance, helpful in building trust, and essential for integration into clinical practice. The proposed method has broad clinical applicability and could potentially be used for mass screening in resource-constrained countries.
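The description above names three concrete ingredients: Gaussian-noise augmentation to cope with lower-quality scans, a CNN classifier for the three tumour subtypes, and post-hoc explanation with LIME and SHAP. This record does not include the authors' code or the exact dual-input architecture, so the following Python sketch is only a minimal illustration of those ingredients under assumed shapes and hyperparameters (128×128 single-channel slices, a small Keras CNN, the `lime` package's image explainer); it is not the published model.

```python
# Minimal sketch (not the authors' code): a small Keras CNN for 3-class MRI
# slice classification, Gaussian-noise augmentation, and a LIME explanation
# of a single prediction. Array shapes and hyperparameters are illustrative.
import numpy as np
from tensorflow.keras import layers, models
from lime import lime_image

NUM_CLASSES = 3  # meningioma, glioma, pituitary

def add_gaussian_noise(images, mean=0.0, std=0.05, seed=None):
    """Simulate lower-quality scans by adding zero-mean Gaussian noise."""
    rng = np.random.default_rng(seed)
    noisy = images + rng.normal(mean, std, size=images.shape)
    return np.clip(noisy, 0.0, 1.0)

def build_cnn(input_shape=(128, 128, 1)):
    """A generic small CNN; the paper's exact architecture is not given here."""
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

model = build_cnn()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder arrays standing in for normalised MRI slices and labels.
x_train = np.random.rand(32, 128, 128, 1).astype("float32")
y_train = np.random.randint(0, NUM_CLASSES, size=32)

# Train on clean and noise-augmented copies of the images.
x_aug = np.concatenate([x_train, add_gaussian_noise(x_train)], axis=0)
y_aug = np.concatenate([y_train, y_train], axis=0)
model.fit(x_aug, y_aug, epochs=1, batch_size=8, verbose=0)

def predict_fn(rgb_batch):
    """LIME passes RGB perturbations; collapse them to one channel for the CNN."""
    gray = rgb_batch.mean(axis=-1, keepdims=True).astype("float32")
    return model.predict(gray, verbose=0)

# LIME fits a sparse local surrogate around one prediction and highlights
# the superpixels that contributed most to the predicted class.
explainer = lime_image.LimeImageExplainer()
explanation = explainer.explain_instance(
    x_train[0, :, :, 0], predict_fn, top_labels=NUM_CLASSES,
    hide_color=0, num_samples=500)
_, mask = explanation.get_image_and_mask(
    explanation.top_labels[0], positive_only=True, num_features=5)
print("Pixels marked as influential:", int(mask.sum()))
```

A SHAP view of the same model could be obtained analogously, for example with `shap.GradientExplainer(model, background_images).shap_values(test_images)`, which attributes a prediction to input pixels via Shapley-value estimates; the specific SHAP explainer the authors used is not stated in this record.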
format Online
Article
Text
id pubmed-8964286
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-8964286 2022-03-30 Explanation-Driven Deep Learning Model for Prediction of Brain Tumour Status Using MRI Image Data Gaur, Loveleen; Bhandari, Mohan; Razdan, Tanvi; Mallik, Saurav; Zhao, Zhongming Front Genet Genetics Frontiers Media S.A. 2022-03-14 /pmc/articles/PMC8964286/ /pubmed/35360838 http://dx.doi.org/10.3389/fgene.2022.822666 Text en Copyright © 2022 Gaur, Bhandari, Razdan, Mallik and Zhao. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Genetics
Gaur, Loveleen
Bhandari, Mohan
Razdan, Tanvi
Mallik, Saurav
Zhao, Zhongming
Explanation-Driven Deep Learning Model for Prediction of Brain Tumour Status Using MRI Image Data
title Explanation-Driven Deep Learning Model for Prediction of Brain Tumour Status Using MRI Image Data
title_full Explanation-Driven Deep Learning Model for Prediction of Brain Tumour Status Using MRI Image Data
title_fullStr Explanation-Driven Deep Learning Model for Prediction of Brain Tumour Status Using MRI Image Data
title_full_unstemmed Explanation-Driven Deep Learning Model for Prediction of Brain Tumour Status Using MRI Image Data
title_short Explanation-Driven Deep Learning Model for Prediction of Brain Tumour Status Using MRI Image Data
title_sort explanation-driven deep learning model for prediction of brain tumour status using mri image data
topic Genetics
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8964286/
https://www.ncbi.nlm.nih.gov/pubmed/35360838
http://dx.doi.org/10.3389/fgene.2022.822666
work_keys_str_mv AT gaurloveleen explanationdrivendeeplearningmodelforpredictionofbraintumourstatususingmriimagedata
AT bhandarimohan explanationdrivendeeplearningmodelforpredictionofbraintumourstatususingmriimagedata
AT razdantanvi explanationdrivendeeplearningmodelforpredictionofbraintumourstatususingmriimagedata
AT malliksaurav explanationdrivendeeplearningmodelforpredictionofbraintumourstatususingmriimagedata
AT zhaozhongming explanationdrivendeeplearningmodelforpredictionofbraintumourstatususingmriimagedata