
Memory Capacity of Networks with Stochastic Binary Synapses

Bibliographic Details
Main Authors: Dubreuil, Alexis M., Amit, Yali, Brunel, Nicolas
Format: Online Article Text
Language: English
Published: Public Library of Science, 2014
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4125071/
https://www.ncbi.nlm.nih.gov/pubmed/25101662
http://dx.doi.org/10.1371/journal.pcbi.1003727

collection PubMed
description In standard attractor neural network models, specific patterns of activity are stored in the synaptic matrix, so that they become fixed point attractors of the network dynamics. The storage capacity of such networks has been quantified in two ways: the maximal number of patterns that can be stored, and the stored information measured in bits per synapse. In this paper, we compute both quantities in fully connected networks of N binary neurons with binary synapses, storing patterns with coding level f, in the large N and sparse coding limits (N → ∞, f → 0). We also derive finite-size corrections that accurately reproduce the results of simulations in networks of tens of thousands of neurons. These methods are applied to three different scenarios: (1) the classic Willshaw model, (2) networks with stochastic learning in which patterns are shown only once (one-shot learning), (3) networks with stochastic learning in which patterns are shown multiple times. The storage capacities are optimized over network parameters, which allows us to compare the performance of the different models. We show that finite-size effects strongly reduce the capacity, even for networks of realistic sizes. We discuss the implications of these results for memory storage in the hippocampus and cerebral cortex.
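
The abstract contrasts the deterministic Willshaw rule with one-shot stochastic learning on binary synapses. The sketch below is a rough illustration, not the authors' code: it builds both binary synaptic matrices for random sparse patterns, runs one step of the binary retrieval dynamics, and computes a crude bits-per-synapse estimate. The values of N, P, f, q and the threshold rule are illustrative assumptions, not parameters or results from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000   # number of binary neurons (illustrative, not from the paper)
P = 50     # number of stored patterns (illustrative)
f = 0.05   # coding level: fraction of active neurons per pattern
q = 0.5    # potentiation probability for the stochastic rule (assumed)

# P x N binary patterns; each neuron is active with probability f.
patterns = (rng.random((P, N)) < f).astype(np.int64)

# Willshaw rule: synapse W[i, j] is set to 1 if neurons i and j are
# coactive in at least one stored pattern (deterministic, saturating).
W_willshaw = (patterns.T @ patterns > 0).astype(np.int64)
np.fill_diagonal(W_willshaw, 0)  # no self-connections

# One-shot stochastic variant: each coactivation potentiates the synapse
# only with probability q, so a single presentation leaves a noisy trace.
W_stoch = np.zeros((N, N), dtype=np.int64)
for mu in range(P):
    coactive = np.outer(patterns[mu], patterns[mu]) == 1
    W_stoch |= (coactive & (rng.random((N, N)) < q)).astype(np.int64)
np.fill_diagonal(W_stoch, 0)

def update(W, state, theta):
    """One step of the binary dynamics: a neuron becomes active if its
    summed input from currently active neurons exceeds the threshold."""
    return (W @ state > theta).astype(np.int64)

# Cue with a stored pattern; at a fixed point the pattern reproduces itself.
cue = patterns[0]
out = update(W_willshaw, cue, theta=0.7 * cue.sum())
overlap = (out * patterns[0]).sum() / patterns[0].sum()
print(f"Willshaw one-step overlap with stored pattern: {overlap:.2f}")

# Rough information estimate: P patterns of N bits with entropy S(f) each,
# divided by the N^2 synapses; a crude bound that ignores retrieval errors.
S = -(f * np.log2(f) + (1 - f) * np.log2(1 - f))
print(f"~{P * N * S / N**2:.3f} bits per synapse (crude estimate)")
```

At this light loading the Willshaw matrix stays sparse, so a clean cue retrieves its pattern; the paper's analysis concerns the opposite regime, where capacity is pushed to its limit and finite-size effects matter.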
id pubmed-4125071
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal PLoS Comput Biol (Research Article)
published online 2014-08-07
license © 2014 Dubreuil et al. Open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.