
A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks

Understanding the theoretical foundations of how memories are encoded and retrieved in neural populations is a central challenge in neuroscience. A popular theoretical scenario for modeling memory function is the attractor neural network scenario, whose prototype is the Hopfield model. The model's simplicity and the locality of its synaptic update rules come at the cost of poor storage capacity compared with the capacity achieved by perceptron learning algorithms. Here, by transforming the perceptron learning rule, we present an online learning rule for a recurrent neural network that achieves near-maximal storage capacity without an explicit supervisory error signal, relying only upon locally accessible information. The fully connected network consists of excitatory binary neurons with plastic recurrent connections and non-plastic inhibitory feedback stabilizing the network dynamics; the patterns to be memorized are presented online as strong afferent currents, producing a bimodal distribution of the neurons' synaptic inputs. Synapses corresponding to active inputs are modified as a function of the value of the local field with respect to three thresholds. Above the highest threshold and below the lowest threshold, no plasticity occurs. Between these two thresholds, potentiation or depression occurs when the local field is above or below an intermediate threshold, respectively. We simulated and analyzed a network of binary neurons implementing this rule and measured its storage capacity for different sizes of the basins of attraction. The storage capacity obtained through numerical simulations is close to the value predicted by analytical calculations. We also measured the dependence of capacity on the strength of external inputs. Finally, we quantified the statistics of the resulting synaptic connectivity matrix and found that both the fraction of zero-weight synapses and the degree of symmetry of the weight matrix increase with the number of stored patterns.
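
The learning rule described above is concrete enough to sketch in code. The following Python fragment is a minimal illustration of the three-threshold update, assuming binary 0/1 neurons; every name and constant in it (N, f, P, X_EXT, THETA_MID, MARGIN, ETA, G_INH, the number of presentation epochs, and the proportional form of the inhibitory feedback) is an assumption made for illustration, not a parameter taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    N = 500            # excitatory binary neurons (illustrative size)
    f = 0.5            # coding level: fraction of active units per pattern
    P = 50             # number of patterns presented online
    X_EXT = 5.0        # strength of the afferent current carrying each pattern
    THETA_MID = 0.0    # intermediate threshold: potentiation vs. depression
    MARGIN = 1.0       # outer thresholds sit at THETA_MID +/- MARGIN
    ETA = 0.01         # learning-rate step
    G_INH = 1.0        # gain of the non-plastic global inhibitory feedback

    W = np.zeros((N, N))                            # plastic excitatory weights
    patterns = (rng.random((P, N)) < f).astype(float)

    for _ in range(20):                             # repeated online presentations
        for xi in patterns:
            # The strong afferent input clamps activity onto the pattern and
            # makes the local-field distribution bimodal.
            inhibition = G_INH * xi.sum() / N       # crude stabilizing feedback
            h = W @ xi - inhibition + X_EXT * (2.0 * xi - 1.0)

            # No plasticity above the highest or below the lowest threshold;
            # in between, potentiate/depress according to the middle one.
            in_zone = (h > THETA_MID - MARGIN) & (h < THETA_MID + MARGIN)
            sign = np.where(h > THETA_MID, 1.0, -1.0)

            # Only synapses with active presynaptic input are modified.
            W += ETA * np.outer(in_zone * sign, xi)
            np.fill_diagonal(W, 0.0)                # no self-connections
            np.clip(W, 0.0, None, out=W)            # weights stay excitatory

The connectivity statistics mentioned at the end of the abstract can then be read off the resulting matrix: the fraction of zero-weight synapses is a direct count, and one common proxy for the degree of symmetry is the correlation between W[i, j] and W[j, i] (the paper's exact measure may differ).

    zero_frac = float(np.mean(W == 0.0))            # fraction of silent synapses
    iu = np.triu_indices(N, k=1)
    symmetry = float(np.corrcoef(W[iu], W.T[iu])[0, 1])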

Bibliographic Details
Main Authors: Alemi, Alireza; Baldassi, Carlo; Brunel, Nicolas; Zecchina, Riccardo
Format: Online Article (Text)
Language: English
Published: PLoS Comput Biol, Public Library of Science, 2015-08-20
Subjects: Research Article
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4546407/
https://www.ncbi.nlm.nih.gov/pubmed/26291608
http://dx.doi.org/10.1371/journal.pcbi.1004439
Source: PubMed collection, National Center for Biotechnology Information (record pubmed-4546407, MEDLINE/PubMed format)
License: © 2015 Alemi et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.