Limitations to Estimating Mutual Information in Large Neural Populations

Information theory provides a powerful framework to analyse the representation of sensory stimuli in neural population activity. However, estimating the quantities involved, such as entropy and mutual information, from finite samples is notoriously hard, and any direct estimate is known to be heavily biased. This is especially true when considering large neural populations. We study a simple model of sensory processing and show through a combinatorial argument that, with high probability, for large neural populations any finite number of samples of neural activity in response to a set of stimuli is mutually distinct. As a consequence, the mutual information, when estimated directly from empirical histograms, will be equal to the stimulus entropy. Importantly, this is the case irrespective of the precise relation between stimulus and neural activity and corresponds to a maximal bias. This argument is general and applies to any application of information theory in which the state space is large and one relies on empirical histograms. Overall, this work highlights the need for alternative approaches for an information-theoretic analysis when dealing with large neural populations.
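The combinatorial argument summarised in the abstract is easy to reproduce numerically. The following is a minimal Python sketch, not taken from the paper: the population size (100 binary neurons), the sample count (50), and the choice of stimulus-independent noise responses are illustrative assumptions. It computes the plug-in estimate I(S;R) = H(S) + H(R) - H(S,R) from empirical histograms and shows that, once every sampled response pattern is distinct, the estimate equals the stimulus entropy even though the true mutual information is zero.

```python
from collections import Counter
import numpy as np

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) entropy in bits, computed from the
    empirical histogram of a sequence of hashable observations."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
n_stimuli, n_neurons, n_samples = 8, 100, 50  # illustrative, not from the paper

# Stimuli are drawn uniformly; responses are pure noise (binary activity
# patterns independent of the stimulus), so the true I(S;R) is exactly zero.
stimuli = rng.integers(n_stimuli, size=n_samples).tolist()
responses = [rng.integers(2, size=n_neurons).tobytes() for _ in range(n_samples)]

H_S = plugin_entropy(stimuli)
H_R = plugin_entropy(responses)
H_SR = plugin_entropy(list(zip(stimuli, responses)))
I_hat = H_S + H_R - H_SR  # plug-in mutual information estimate

print(f"plug-in I(S;R) = {I_hat:.3f} bits; stimulus entropy H(S) = {H_S:.3f} bits")
# With 2^100 possible patterns, the 50 noise samples are mutually distinct with
# overwhelming (birthday-problem) probability, so H(R) = H(S,R) = log2(50) and
# the estimate collapses to H(S): the maximal bias described in the abstract.
```

Increasing the sample count only shifts the problem: the estimate tracks the empirical stimulus entropy for any number of samples that remains small relative to the size of the response state space.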

Bibliographic Details
Main Authors: Mölter, Jan, Goodhill, Geoffrey J.
Format: Online Article Text
Language: English
Published: MDPI 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7516973/
https://www.ncbi.nlm.nih.gov/pubmed/33286264
http://dx.doi.org/10.3390/e22040490
_version_ 1783587121602232320
author Mölter, Jan
Goodhill, Geoffrey J.
author_facet Mölter, Jan
Goodhill, Geoffrey J.
author_sort Mölter, Jan
collection PubMed
description Information theory provides a powerful framework to analyse the representation of sensory stimuli in neural population activity. However, estimating the quantities involved, such as entropy and mutual information, from finite samples is notoriously hard, and any direct estimate is known to be heavily biased. This is especially true when considering large neural populations. We study a simple model of sensory processing and show through a combinatorial argument that, with high probability, for large neural populations any finite number of samples of neural activity in response to a set of stimuli is mutually distinct. As a consequence, the mutual information, when estimated directly from empirical histograms, will be equal to the stimulus entropy. Importantly, this is the case irrespective of the precise relation between stimulus and neural activity and corresponds to a maximal bias. This argument is general and applies to any application of information theory in which the state space is large and one relies on empirical histograms. Overall, this work highlights the need for alternative approaches for an information-theoretic analysis when dealing with large neural populations.
format Online
Article
Text
id pubmed-7516973
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7516973 2020-11-09 Limitations to Estimating Mutual Information in Large Neural Populations Mölter, Jan Goodhill, Geoffrey J. Entropy (Basel) Article Information theory provides a powerful framework to analyse the representation of sensory stimuli in neural population activity. However, estimating the quantities involved, such as entropy and mutual information, from finite samples is notoriously hard, and any direct estimate is known to be heavily biased. This is especially true when considering large neural populations. We study a simple model of sensory processing and show through a combinatorial argument that, with high probability, for large neural populations any finite number of samples of neural activity in response to a set of stimuli is mutually distinct. As a consequence, the mutual information, when estimated directly from empirical histograms, will be equal to the stimulus entropy. Importantly, this is the case irrespective of the precise relation between stimulus and neural activity and corresponds to a maximal bias. This argument is general and applies to any application of information theory in which the state space is large and one relies on empirical histograms. Overall, this work highlights the need for alternative approaches for an information-theoretic analysis when dealing with large neural populations. MDPI 2020-04-24 /pmc/articles/PMC7516973/ /pubmed/33286264 http://dx.doi.org/10.3390/e22040490 Text en © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Mölter, Jan
Goodhill, Geoffrey J.
Limitations to Estimating Mutual Information in Large Neural Populations
title Limitations to Estimating Mutual Information in Large Neural Populations
title_full Limitations to Estimating Mutual Information in Large Neural Populations
title_fullStr Limitations to Estimating Mutual Information in Large Neural Populations
title_full_unstemmed Limitations to Estimating Mutual Information in Large Neural Populations
title_short Limitations to Estimating Mutual Information in Large Neural Populations
title_sort limitations to estimating mutual information in large neural populations
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7516973/
https://www.ncbi.nlm.nih.gov/pubmed/33286264
http://dx.doi.org/10.3390/e22040490
work_keys_str_mv AT molterjan limitationstoestimatingmutualinformationinlargeneuralpopulations
AT goodhillgeoffreyj limitationstoestimatingmutualinformationinlargeneuralpopulations