
On the Maximum Storage Capacity of the Hopfield Model

Recurrent neural networks (RNNs) have long been of great interest for their capacity to store memories. In past years, several works have been devoted to determining the maximum storage capacity of RNNs, especially for the Hopfield network, the most popular kind of RNN. By analyzing the thermodynamic limit of the statistical properties of the Hamiltonian corresponding to the Hopfield neural network, it has been shown in the literature that the retrieval errors diverge when the number of stored memory patterns (P) exceeds a fraction (≈ 14%) of the network size N. In this paper, we study the storage performance of a generalized Hopfield model in which the diagonal elements of the connection matrix are allowed to be non-zero. We investigate this model at finite N. We give an analytical expression for the number of retrieval errors and show that, as the number of stored patterns increases beyond a certain threshold, the errors start to decrease and fall below unity for P ≫ N. We demonstrate that the strongest trade-off between efficiency and effectiveness depends on the number of patterns (P) stored in the network through the appropriate choice of connection weights. When P ≫ N and the diagonal elements of the adjacency matrix are not forced to be zero, the optimal storage capacity is obtained with a number of stored memories much larger than previously reported. This theory paves the way to the design of RNNs with high storage capacity that can retrieve the desired pattern without distortion.
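To make the setting concrete, below is a minimal Python sketch of the quantities the abstract refers to; it is not taken from the paper, and the specific values (N = 100, P = 20), the synchronous single-step update, and the choice of Hebbian weights are this editor's assumptions. It stores P random binary patterns in an N-unit network and counts one-step retrieval errors both with the diagonal of the connection matrix kept (as in the generalized model studied here) and with it zeroed (as in the classical Hopfield convention).

import numpy as np

rng = np.random.default_rng(0)

N = 100   # network size (illustrative value)
P = 20    # number of stored patterns (illustrative value)

# Random binary (+/-1) patterns to store.
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian connection matrix: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu.
J = patterns.T @ patterns / N

# Classical Hopfield convention forces the diagonal to zero;
# the generalized model studied in the paper keeps it.
J_zero_diag = J.copy()
np.fill_diagonal(J_zero_diag, 0.0)

def retrieval_errors(J, pattern):
    """Apply one synchronous update starting from the stored pattern
    itself and count the units that flip (the retrieval errors)."""
    updated = np.sign(J @ pattern)
    updated[updated == 0] = 1  # break ties deterministically
    return int(np.sum(updated != pattern))

errs_diag = sum(retrieval_errors(J, p) for p in patterns)
errs_nodiag = sum(retrieval_errors(J_zero_diag, p) for p in patterns)
print(f"total errors, diagonal kept:   {errs_diag}")
print(f"total errors, diagonal zeroed: {errs_nodiag}")

Rerunning the sketch with increasing P shows how the error count behaves as storage load grows; the paper's analytical results concern exactly this kind of error count at finite N, in particular in the P ≫ N regime.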


Bibliographic Details
Main Authors: Folli, Viola; Leonetti, Marco; Ruocco, Giancarlo
Format: Online Article (Text)
Language: English
Journal: Frontiers in Computational Neuroscience (Front Comput Neurosci)
Published: Frontiers Media S.A., 10 January 2017
Subjects: Neuroscience
License: Copyright © 2017 Folli, Leonetti and Ruocco. Open access under the Creative Commons Attribution License (CC BY).
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5222833/
https://www.ncbi.nlm.nih.gov/pubmed/28119595
http://dx.doi.org/10.3389/fncom.2016.00144