
Information Entropy Measures for Evaluation of Reliability of Deep Neural Network Results

Bibliographic Details
Main Authors: Gireesh, Elakkat D., Gurupur, Varadaraj P.
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10137523/
https://www.ncbi.nlm.nih.gov/pubmed/37190360
http://dx.doi.org/10.3390/e25040573
author Gireesh, Elakkat D.
Gurupur, Varadaraj P.
collection PubMed
description Deep neural networks (DNNs) analyze given data to arrive at decisions regarding their inputs. The decision-making process of a DNN model is not entirely transparent, and the confidence of the model's predictions on new data fed into the network can vary. We address the questions of how certain the decision making is and how adequately information is captured by DNN models during this process. We introduce a measure called the certainty index, which is based on the outputs of the penultimate layer of the DNN. In this approach, we employed iEEG (intracranial electroencephalogram) data to train and test the DNN. When arriving at model predictions, the contribution of the entire information content of the input may be important. We explored the relationship between the certainty of DNN predictions and the information content of the signal by estimating the sample entropy and using a heatmap of the signal. While it can be assumed that the entire sample must be utilized to arrive at the most appropriate decisions, an evaluation of DNNs from this standpoint has not been reported. We demonstrate that the relationship between the certainty index and the sample entropy, assessed through the sample entropy-heatmap correlation, is more robust than its relationship with the original signal, indicating that the DNN focuses on information-rich regions of the signal to arrive at decisions. Therefore, it can be concluded that the certainty of a decision is related to the DNN's ability to capture the information in the original signal. Our results indicate that, within its limitations, the certainty index can be used as a useful tool for estimating the confidence of predictions. The certainty index appears to be related to how effectively the DNN heatmaps captured the information content in the signal.
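The abstract rests on three computable quantities: the sample entropy of the iEEG signal, a certainty index derived from the outputs of the DNN's penultimate layer, and a correlation between the signal's local entropy profile and a saliency heatmap. The Python sketch below is a minimal illustration of how such quantities could be computed; the margin-based `certainty_index` and the windowed `entropy_heatmap_correlation` are assumptions introduced here for illustration, since the record does not give the authors' exact definitions.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Standard SampEn(m, r): -ln(A/B), where B counts matching template
    pairs of length m and A counts matching pairs of length m+1
    (Chebyshev distance < r, self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)  # common tolerance: 20% of the signal SD

    def matches(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        total = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += np.sum(d < r)
        return total

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf


def certainty_index(penultimate_activations, readout_weights, readout_bias):
    """ASSUMPTION: one plausible certainty measure built from the
    penultimate-layer outputs -- the softmax margin between the top two
    class scores. The paper's actual formula may differ."""
    logits = penultimate_activations @ readout_weights + readout_bias
    p = np.exp(logits - logits.max())
    p /= p.sum()
    top2 = np.sort(p)[-2:]
    return float(top2[1] - top2[0])


def entropy_heatmap_correlation(signal, heatmap, win=250):
    """ASSUMPTION: compare a saliency heatmap with the signal's local
    information content by correlating per-window sample entropy with
    per-window mean heatmap magnitude (Pearson r)."""
    ent, sal = [], []
    for start in range(0, len(signal) - win + 1, win):
        ent.append(sample_entropy(signal[start:start + win]))
        sal.append(np.mean(heatmap[start:start + win]))
    ent, sal = np.array(ent), np.array(sal)
    ok = np.isfinite(ent)
    return float(np.corrcoef(ent[ok], sal[ok])[0, 1])
```

On this reading, a stronger sample entropy-heatmap correlation for high-certainty predictions would indicate that the network attends to information-rich regions of the signal, which is the relationship the abstract reports.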
format Online
Article
Text
id pubmed-10137523
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
journal Entropy (Basel)
published_online 2023-03-27
rights © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Information Entropy Measures for Evaluation of Reliability of Deep Neural Network Results
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10137523/
https://www.ncbi.nlm.nih.gov/pubmed/37190360
http://dx.doi.org/10.3390/e25040573