Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition
During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning and learning of mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connec...
Main Authors: | Bill, Johannes; Buesing, Lars; Habenschuss, Stefan; Nessler, Bernhard; Maass, Wolfgang; Legenstein, Robert |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2015 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4540468/ https://www.ncbi.nlm.nih.gov/pubmed/26284370 http://dx.doi.org/10.1371/journal.pone.0134356 |
_version_ | 1782386249117466624 |
---|---|
author | Bill, Johannes; Buesing, Lars; Habenschuss, Stefan; Nessler, Bernhard; Maass, Wolfgang; Legenstein, Robert |
author_facet | Bill, Johannes; Buesing, Lars; Habenschuss, Stefan; Nessler, Bernhard; Maass, Wolfgang; Legenstein, Robert |
author_sort | Bill, Johannes |
collection | PubMed |
description | During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning and learning of mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these calculations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model that accommodates both local competitive and large-scale associative aspects of neural information processing, under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition in response to distributed spiking input can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity on the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters which equate with Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience on correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input. (An illustrative code sketch of this architecture follows the record fields below.) |
format | Online Article Text |
id | pubmed-4540468 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2015 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-4540468 2015-08-24 Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition Bill, Johannes Buesing, Lars Habenschuss, Stefan Nessler, Bernhard Maass, Wolfgang Legenstein, Robert PLoS One Research Article During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning and learning of mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these calculations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model that accommodates both local competitive and large-scale associative aspects of neural information processing, under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition in response to distributed spiking input can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity on the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters which equate with Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience on correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input. Public Library of Science 2015-08-18 /pmc/articles/PMC4540468/ /pubmed/26284370 http://dx.doi.org/10.1371/journal.pone.0134356 Text en © 2015 Bill et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited. |
spellingShingle | Research Article Bill, Johannes Buesing, Lars Habenschuss, Stefan Nessler, Bernhard Maass, Wolfgang Legenstein, Robert Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition |
title | Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition |
title_full | Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition |
title_fullStr | Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition |
title_full_unstemmed | Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition |
title_short | Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition |
title_sort | distributed bayesian computation and self-organized learning in sheets of spiking neurons with local lateral inhibition |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4540468/ https://www.ncbi.nlm.nih.gov/pubmed/26284370 http://dx.doi.org/10.1371/journal.pone.0134356 |
work_keys_str_mv | AT billjohannes distributedbayesiancomputationandselforganizedlearninginsheetsofspikingneuronswithlocallateralinhibition AT buesinglars distributedbayesiancomputationandselforganizedlearninginsheetsofspikingneuronswithlocallateralinhibition AT habenschussstefan distributedbayesiancomputationandselforganizedlearninginsheetsofspikingneuronswithlocallateralinhibition AT nesslerbernhard distributedbayesiancomputationandselforganizedlearninginsheetsofspikingneuronswithlocallateralinhibition AT maasswolfgang distributedbayesiancomputationandselforganizedlearninginsheetsofspikingneuronswithlocallateralinhibition AT legensteinrobert distributedbayesiancomputationandselforganizedlearninginsheetsofspikingneuronswithlocallateralinhibition |
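
The description above characterizes the model in prose: a sheet of local competitive circuits (pyramidal cells under shared local inhibition), lateral excitatory connections between circuits, spiking dynamics interpreted as sampling from a variational posterior, and learning through Hebbian synaptic plus homeostatic intrinsic plasticity. The Python sketch below is only a rough illustration of that general architecture under strong simplifying assumptions (discrete time, one winner per local circuit, heuristic update rules); it is not the authors' implementation, and every name in it (n_circuits, n_per_circuit, sample_network_state, eta, target_rate, and so on) is invented for this example.

```python
# Illustrative sketch (not the paper's implementation): a sheet of local
# winner-take-all circuits ("local experts") with feedforward weights W,
# lateral excitatory weights V between circuits, and per-neuron
# excitabilities b. Inference draws one active neuron per circuit from a
# softmax over feedforward + lateral evidence; learning combines a
# Hebbian-style weight update with a homeostatic excitability update.
import numpy as np

rng = np.random.default_rng(0)

n_circuits = 4        # local WTA circuits on the sheet (assumed value)
n_per_circuit = 8     # neurons competing within each circuit (assumed value)
n_inputs = 20         # afferent input channels (assumed value)

W = rng.normal(0.0, 0.1, size=(n_circuits, n_per_circuit, n_inputs))  # feedforward
V = np.zeros((n_circuits, n_per_circuit, n_circuits, n_per_circuit))  # lateral
b = np.zeros((n_circuits, n_per_circuit))                             # excitability

def sample_network_state(x, z_prev):
    """One sampling sweep: each circuit draws its winner from a softmax over
    feedforward drive, lateral drive from the other circuits' previous
    winners, and the intrinsic excitability."""
    z = np.zeros((n_circuits, n_per_circuit))
    for k in range(n_circuits):
        u = W[k] @ x + b[k]                       # feedforward + intrinsic drive
        for m in range(n_circuits):
            if m != k:
                u += V[k, :, m] @ z_prev[m]       # lateral excitation
        p = np.exp(u - u.max())
        p /= p.sum()                              # local softmax competition
        winner = rng.choice(n_per_circuit, p=p)
        z[k, winner] = 1.0                        # one "spike" per circuit
    return z

def update_parameters(x, z, eta=0.01, target_rate=1.0 / n_per_circuit):
    """Hebbian-style updates for W and V, homeostatic update for b."""
    for k in range(n_circuits):
        for i in range(n_per_circuit):
            if z[k, i] == 1.0:
                W[k, i] += eta * (x - W[k, i])            # move toward the input
                for m in range(n_circuits):
                    if m != k:
                        V[k, i, m] += eta * (z[m] - V[k, i, m])  # co-activation
        b[k] += eta * (target_rate - z[k])                # equalize expert usage

# Toy run on random binary input patterns.
z = np.zeros((n_circuits, n_per_circuit))
for step in range(200):
    x = (rng.random(n_inputs) < 0.2).astype(float)
    z = sample_network_state(x, z)
    update_parameters(x, z)
```

In this simplification, the local softmax stands in for lateral inhibition (only one neuron per circuit is active at a time), the V terms carry the large-scale associative coupling between assemblies, and the homeostatic term nudges each neuron's excitability so that the experts within a circuit are used at roughly equal rates.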