Stimulus-dependent Maximum Entropy Models of Neural Population Codes
Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. …
Main Authors: | Granot-Atedgi, Einat; Tkačik, Gašper; Segev, Ronen; Schneidman, Elad |
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2013 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3597542/ https://www.ncbi.nlm.nih.gov/pubmed/23516339 http://dx.doi.org/10.1371/journal.pcbi.1002922 |
_version_ | 1782262645248753664 |
author | Granot-Atedgi, Einat; Tkačik, Gašper; Segev, Ronen; Schneidman, Elad
author_facet | Granot-Atedgi, Einat; Tkačik, Gašper; Segev, Ronen; Schneidman, Elad
author_sort | Granot-Atedgi, Einat |
collection | PubMed |
description | Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model—a minimal extension of the canonical linear-nonlinear model of a single neuron, to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single cell responses and in particular significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population. |
format | Online Article Text |
id | pubmed-3597542 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2013 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-3597542 2013-03-20 Stimulus-dependent Maximum Entropy Models of Neural Population Codes Granot-Atedgi, Einat Tkačik, Gašper Segev, Ronen Schneidman, Elad PLoS Comput Biol Research Article Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model—a minimal extension of the canonical linear-nonlinear model of a single neuron, to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single cell responses and in particular significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population. Public Library of Science 2013-03-14 /pmc/articles/PMC3597542/ /pubmed/23516339 http://dx.doi.org/10.1371/journal.pcbi.1002922 Text en © 2013 Granot-Atedgi et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
spellingShingle | Research Article Granot-Atedgi, Einat Tkačik, Gašper Segev, Ronen Schneidman, Elad Stimulus-dependent Maximum Entropy Models of Neural Population Codes |
title | Stimulus-dependent Maximum Entropy Models of Neural Population Codes |
title_full | Stimulus-dependent Maximum Entropy Models of Neural Population Codes |
title_fullStr | Stimulus-dependent Maximum Entropy Models of Neural Population Codes |
title_full_unstemmed | Stimulus-dependent Maximum Entropy Models of Neural Population Codes |
title_short | Stimulus-dependent Maximum Entropy Models of Neural Population Codes |
title_sort | stimulus-dependent maximum entropy models of neural population codes |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3597542/ https://www.ncbi.nlm.nih.gov/pubmed/23516339 http://dx.doi.org/10.1371/journal.pcbi.1002922 |
work_keys_str_mv | AT granotatedgieinat stimulusdependentmaximumentropymodelsofneuralpopulationcodes AT tkacikgasper stimulusdependentmaximumentropymodelsofneuralpopulationcodes AT segevronen stimulusdependentmaximumentropymodelsofneuralpopulationcodes AT schneidmanelad stimulusdependentmaximumentropymodelsofneuralpopulationcodes |
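The description above introduces the stimulus-dependent maximum entropy (SDME) model as a minimal pairwise-coupled extension of the linear-nonlinear (LN) single-neuron model. As a hedged sketch of the standard form such a model takes (the notation here follows common conventions for pairwise maximum entropy models and is an assumption, not a quotation from the article): for N cells with binary spike/silence variables σ_i in a given time bin, the conditional distribution over population codewords given the stimulus s can be written as

\[
% Sketch under assumed notation; not quoted from the article.
P(\vec{\sigma} \mid s) \;=\; \frac{1}{Z(s)}\,
\exp\!\left( \sum_{i} h_i(s)\,\sigma_i \;+\; \sum_{i<j} J_{ij}\,\sigma_i \sigma_j \right),
\]

where each h_i(s) is a stimulus-dependent field obtained, in the LN spirit, by filtering the stimulus and passing the result through a pointwise nonlinearity, the J_{ij} are stimulus-independent pairwise couplings, and Z(s) is the normalizing partition function. Setting all J_{ij} = 0 recovers a population of independent LN-type cells, i.e. the uncoupled model that the description says the SDME model outperforms. The information-theoretic quantities mentioned at the end of the description can then be expressed as entropy differences, for example

\[
% Mutual information between codewords and stimulus as an entropy difference.
I(\vec{\sigma}; s) \;=\; S\!\left[ P(\vec{\sigma}) \right] \;-\;
\Big\langle S\!\left[ P(\vec{\sigma} \mid s) \right] \Big\rangle_{s},
\]

where S[·] is the Shannon entropy, ⟨·⟩_s denotes an average over the stimulus ensemble, and the surprise of a codeword is −log₂ of its probability, so that averaging the surprise over responses recovers the corresponding entropy. The article's exact parametrization, constraints, and fitting procedure are specified in the full text linked above.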