
Python for Information Theoretic Analysis of Neural Data

Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their...

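The abstract refers to computing information theoretic quantities, such as the mutual information between stimuli and neural responses, from limited experimental data. As a rough illustration only, and not code taken from the paper, the sketch below shows a plug-in estimate of I(S;R) for discrete stimulus and response labels in Python/NumPy, with an optional first-order (Miller-Madow style) correction for the limited-sampling bias the authors discuss. The function name, its arguments, and the particular bias correction used are assumptions made for illustration.

import numpy as np

def plugin_mutual_information(stimuli, responses, bias_correct=False):
    """Plug-in estimate of I(S;R) in bits from paired discrete labels (illustrative sketch)."""
    stimuli = np.asarray(stimuli)
    responses = np.asarray(responses)
    n = stimuli.size

    # Joint histogram of (stimulus, response) occurrences.
    s_vals, s_idx = np.unique(stimuli, return_inverse=True)
    r_vals, r_idx = np.unique(responses, return_inverse=True)
    joint = np.zeros((s_vals.size, r_vals.size))
    np.add.at(joint, (s_idx, r_idx), 1)

    p_sr = joint / n        # joint probabilities P(s, r)
    p_s = p_sr.sum(axis=1)  # marginal P(s)
    p_r = p_sr.sum(axis=0)  # marginal P(r)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # I(S;R) = H(S) + H(R) - H(S,R)
    info = entropy(p_s) + entropy(p_r) - entropy(p_sr.ravel())

    if bias_correct:
        # Naive first-order (Miller-Madow style) correction for the
        # limited-sampling bias of the plug-in information estimate;
        # the added term is typically negative, shrinking the estimate.
        k_s = np.count_nonzero(p_s)
        k_r = np.count_nonzero(p_r)
        k_sr = np.count_nonzero(p_sr)
        info += (k_s + k_r - k_sr - 1) / (2.0 * n * np.log(2))

    return info

# Example usage: one stimulus label and one binned spike-count label per trial.
# stim = [0, 0, 1, 1, 2, 2]; resp = [3, 2, 5, 4, 0, 1]
# info_bits = plugin_mutual_information(stim, resp, bias_correct=True)

In practice one would pass one label per trial; the work described in the article addresses considerably more sophisticated bias corrections and maximum entropy methods than this naive sketch.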

Bibliographic Details
Main Authors: Ince, Robin A. A., Petersen, Rasmus S., Swan, Daniel C., Panzeri, Stefano
Format: Text
Language: English
Published: Frontiers Research Foundation 2009
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2647335/
https://www.ncbi.nlm.nih.gov/pubmed/19242557
http://dx.doi.org/10.3389/neuro.11.004.2009
_version_ 1782164923063730176
author Ince, Robin A. A.
Petersen, Rasmus S.
Swan, Daniel C.
Panzeri, Stefano
author_facet Ince, Robin A. A.
Petersen, Rasmus S.
Swan, Daniel C.
Panzeri, Stefano
author_sort Ince, Robin A. A.
collection PubMed
description Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources.
format Text
id pubmed-2647335
institution National Center for Biotechnology Information
language English
publishDate 2009
publisher Frontiers Research Foundation
record_format MEDLINE/PubMed
spelling pubmed-26473352009-02-25 Python for Information Theoretic Analysis of Neural Data Ince, Robin A. A. Petersen, Rasmus S. Swan, Daniel C. Panzeri, Stefano Front Neuroinformatics Neuroscience Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources. Frontiers Research Foundation 2009-02-11 /pmc/articles/PMC2647335/ /pubmed/19242557 http://dx.doi.org/10.3389/neuro.11.004.2009 Text en Copyright © 2009 Ince, Petersen, Swan and Panzeri. http://www.frontiersin.org/licenseagreement This is an open-access article subject to an exclusive license agreement between the authors and the Frontiers Research Foundation, which permits unrestricted use, distribution, and reproduction in any medium, provided the original authors and source are credited.
spellingShingle Neuroscience
Ince, Robin A. A.
Petersen, Rasmus S.
Swan, Daniel C.
Panzeri, Stefano
Python for Information Theoretic Analysis of Neural Data
title Python for Information Theoretic Analysis of Neural Data
title_full Python for Information Theoretic Analysis of Neural Data
title_fullStr Python for Information Theoretic Analysis of Neural Data
title_full_unstemmed Python for Information Theoretic Analysis of Neural Data
title_short Python for Information Theoretic Analysis of Neural Data
title_sort python for information theoretic analysis of neural data
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2647335/
https://www.ncbi.nlm.nih.gov/pubmed/19242557
http://dx.doi.org/10.3389/neuro.11.004.2009
work_keys_str_mv AT incerobinaa pythonforinformationtheoreticanalysisofneuraldata
AT petersenrasmuss pythonforinformationtheoreticanalysisofneuraldata
AT swandanielc pythonforinformationtheoreticanalysisofneuraldata
AT panzeristefano pythonforinformationtheoreticanalysisofneuraldata