Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding

Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information, but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While our asymptotic formulas all work for discrete variables, one of them has consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may make it more convenient to apply information theory to many practical and theoretical problems.
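For context on the restriction the abstract describes, the classical large-population asymptotic (in the spirit of Brunel and Nadal, 1998; not necessarily the exact formulas derived in this paper) approximates the mutual information between a continuous scalar stimulus Θ and the population response R as

\[
I(\Theta; R) \;\approx\; H(\Theta) \;-\; \frac{1}{2}\int p(\theta)\,\log\frac{2\pi e}{J(\theta)}\,d\theta,
\qquad
J(\theta) \;=\; \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\log p(R\mid\theta)\right)^{2}\,\middle|\,\theta\right],
\]

which cannot be applied to discrete Θ because the Fisher information J(θ) requires differentiating with respect to θ.

For discrete stimuli, the mutual information can instead be estimated directly by Monte Carlo sampling, which is the kind of ground truth that closed-form approximations are benchmarked against. The sketch below is a minimal illustration, not the authors' method: the Gaussian tuning curves, Poisson spiking model, and all parameter values are hypothetical.

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(0)

# Hypothetical toy model: M discrete stimuli, N independent Poisson neurons
# with Gaussian tuning curves (illustrative values only).
M, N = 8, 50
stimuli = np.linspace(-1.0, 1.0, M)
centers = np.linspace(-1.0, 1.0, N)
rates = 5.0 * np.exp(-(stimuli[:, None] - centers[None, :])**2 / (2 * 0.3**2)) + 0.1
p_theta = np.full(M, 1.0 / M)  # uniform stimulus prior

def log_poisson(r, lam):
    """log P(r | lam), summed over independent neurons (last axis)."""
    return np.sum(r * np.log(lam) - lam - gammaln(r + 1), axis=-1)

def mi_monte_carlo(n_samples=20_000):
    """Estimate I(Theta; R) = E[log p(r|theta) - log p(r)] in nats."""
    total = 0.0
    for _ in range(n_samples):
        k = rng.choice(M, p=p_theta)          # draw a stimulus
        r = rng.poisson(rates[k])             # draw a population response
        log_cond = log_poisson(r, rates[k])   # log p(r | theta_k)
        # log p(r) = logsumexp_j [ log p(theta_j) + log p(r | theta_j) ]
        log_marg = np.logaddexp.reduce(np.log(p_theta) + log_poisson(r[None, :], rates))
        total += log_cond - log_marg
    return total / n_samples

print(f"Monte Carlo MI ~ {mi_monte_carlo():.3f} nats (ceiling: log M = {np.log(M):.3f})")
```

The marginal p(r) here is a sum over only M = 8 stimulus values, so the estimator stays cheap even for large N; for continuous or high-dimensional stimuli that sum becomes an intractable integral, which is the practical motivation for approximation formulas of the kind the paper proposes.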

Bibliographic Details
Main authors: Huang, Wentao; Zhang, Kechen
Format: Online article (text)
Journal: Entropy (Basel)
Language: English
Published: MDPI, 4 March 2019
License: © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514724/
https://www.ncbi.nlm.nih.gov/pubmed/33266958
http://dx.doi.org/10.3390/e21030243