A Quick and Easy Way to Estimate Entropy and Mutual Information for Neuroscience

Bibliographic Details
Main Authors: Zbili, Mickael; Rama, Sylvain
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8239197/
https://www.ncbi.nlm.nih.gov/pubmed/34211385
http://dx.doi.org/10.3389/fninf.2021.596443
author Zbili, Mickael
Rama, Sylvain
author_facet Zbili, Mickael
Rama, Sylvain
author_sort Zbili, Mickael
collection PubMed
description Calculations of the entropy of a signal or the mutual information between two variables are valuable analytical tools in the field of neuroscience. They can be applied to all types of data, capture non-linear interactions and are model independent. Yet the limited size and number of recordings one can collect in a series of experiments make their calculation highly prone to sampling bias. Mathematical methods to overcome this so-called “sampling disaster” exist, but require significant expertise and considerable time and computational cost. As such, there is a need for a simple, unbiased and computationally efficient tool for estimating the level of entropy and mutual information. In this article, we propose that applying entropy-encoding compression algorithms widely used in text and image compression fulfills these requirements. By simply saving the signal in PNG picture format and measuring the size of the file on the hard drive, we can estimate entropy changes across different conditions. Furthermore, with some simple modifications of the PNG file, we can also estimate the evolution of mutual information between a stimulus and the observed responses across different conditions. We first demonstrate the applicability of this method using white-noise-like signals. Then, while this method can be used in all kinds of experimental conditions, we provide examples of its application to patch-clamp recordings, detection of place cells and histological data. Although this method does not give an absolute value of entropy or mutual information, it is mathematically correct, and its simplicity and broad availability make it a powerful tool for estimating both across experiments.
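To make the described method concrete, here is a minimal Python sketch of the core idea, assuming NumPy and Pillow are available; the function name png_size_entropy_proxy and the 8-bit quantization are illustrative choices, not the authors' exact pipeline. It quantizes a signal to gray levels, saves it as a PNG in memory, and reads the compressed size back as an entropy proxy.

import io

import numpy as np
from PIL import Image

def png_size_entropy_proxy(signal, levels=256):
    """Proxy for signal entropy: the byte size of the signal saved as a PNG.

    PNG's DEFLATE stage is an entropy-encoding compressor, so a less
    predictable signal compresses worse and yields a larger file. The
    number is only meaningful when compared across conditions, not as
    an absolute entropy in bits.
    """
    s = np.asarray(signal, dtype=float)
    # Rescale to [0, levels - 1] and quantize to unsigned 8-bit pixels.
    s = (s - s.min()) / (s.max() - s.min() + 1e-12) * (levels - 1)
    pixels = s.astype(np.uint8).reshape(1, -1)  # one-row grayscale image
    buf = io.BytesIO()
    Image.fromarray(pixels, mode="L").save(buf, format="PNG")
    return buf.getbuffer().nbytes  # larger file -> higher estimated entropy

# White-noise-like signals should give a larger proxy than a constant
# signal of the same length, matching the paper's first demonstration.
rng = np.random.default_rng(0)
print(png_size_entropy_proxy(rng.normal(size=10_000)))  # noisy: large
print(png_size_entropy_proxy(np.zeros(10_000)))         # constant: small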
format Online
Article
Text
id pubmed-8239197
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-8239197 2021-06-30 A Quick and Easy Way to Estimate Entropy and Mutual Information for Neuroscience Zbili, Mickael Rama, Sylvain Front Neuroinform Neuroscience Frontiers Media S.A. 2021-06-15 /pmc/articles/PMC8239197/ /pubmed/34211385 http://dx.doi.org/10.3389/fninf.2021.596443 Text en Copyright © 2021 Zbili and Rama. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Zbili, Mickael
Rama, Sylvain
A Quick and Easy Way to Estimate Entropy and Mutual Information for Neuroscience
title A Quick and Easy Way to Estimate Entropy and Mutual Information for Neuroscience
title_full A Quick and Easy Way to Estimate Entropy and Mutual Information for Neuroscience
title_fullStr A Quick and Easy Way to Estimate Entropy and Mutual Information for Neuroscience
title_full_unstemmed A Quick and Easy Way to Estimate Entropy and Mutual Information for Neuroscience
title_short A Quick and Easy Way to Estimate Entropy and Mutual Information for Neuroscience
title_sort quick and easy way to estimate entropy and mutual information for neuroscience
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8239197/
https://www.ncbi.nlm.nih.gov/pubmed/34211385
http://dx.doi.org/10.3389/fninf.2021.596443
work_keys_str_mv AT zbilimickael aquickandeasywaytoestimateentropyandmutualinformationforneuroscience
AT ramasylvain aquickandeasywaytoestimateentropyandmutualinformationforneuroscience
AT zbilimickael quickandeasywaytoestimateentropyandmutualinformationforneuroscience
AT ramasylvain quickandeasywaytoestimateentropyandmutualinformationforneuroscience