
Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks

In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost always disallowed, in both artificial and biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses...

Full description

Bibliographic Details
Main Authors: Gosti, Giorgio, Folli, Viola, Leonetti, Marco, Ruocco, Giancarlo
Format: Online Article Text
Language: English
Published: MDPI 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7515255/
https://www.ncbi.nlm.nih.gov/pubmed/33267440
http://dx.doi.org/10.3390/e21080726
_version_ 1783586774884286464
author Gosti, Giorgio
Folli, Viola
Leonetti, Marco
Ruocco, Giancarlo
author_facet Gosti, Giorgio
Folli, Viola
Leonetti, Marco
Ruocco, Giancarlo
author_sort Gosti, Giorgio
collection PubMed
description In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost always disallowed, in both artificial and biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows how, in an N-node Hopfield neural network with autapses, the number of stored patterns (P) is not limited to the well-known bound 0.14N, as it is for networks without autapses. More precisely, it describes how, as the number of stored patterns increases well over the 0.14N threshold, for P much greater than N, the retrieval error asymptotically approaches a value below unity. Consequently, the reduction in retrieval error allows a number of stored memories that largely exceeds what was previously considered possible. Unfortunately, soon after, new results showed that, in the thermodynamic limit, for a network with autapses in this high-storage regime, the basin of attraction of the stored memories shrinks to a single state. This means that, for each stable state associated with a stored memory, even a single-bit error in the initial pattern would lead the system to a stationary state associated with a different memory state. This limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome this limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy allows us to form what we call an absorbing neighborhood of states surrounding each stored memory. An absorbing neighborhood is a set defined by a Hamming distance around a network state; it is absorbing because, in the long-time limit, states inside it are absorbed by stable states in the set. We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing neighborhood of exponentially growing size.
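To make the ideas in the description concrete, the following is a minimal illustrative sketch in Python/NumPy, not the authors' construction: it builds a Hebbian weight matrix for an N-node Hopfield network, either keeping the autapses (the diagonal self-connections discussed above) or zeroing them as in the classic model, and then runs sign-update dynamics on a probe state one bit of Hamming distance away from a stored pattern. All function names and parameter values (hebbian_weights, retrieve, N = 100, P = 20) are illustrative assumptions, not taken from the paper.

import numpy as np

def hebbian_weights(patterns, keep_autapses=True):
    # Hebbian weight matrix for binary (+/-1) patterns of shape (P, N).
    # keep_autapses=True retains the diagonal self-connections (autapses);
    # keep_autapses=False zeroes them, as in the classic Hopfield prescription.
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    if not keep_autapses:
        np.fill_diagonal(W, 0.0)
    return W

def retrieve(W, state, max_steps=100):
    # Synchronous sign-update dynamics until a fixed point or the step limit.
    s = state.copy()
    for _ in range(max_steps):
        new = np.sign(W @ s)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, s):
            break
        s = new
    return s

# Toy usage: store P random patterns, then probe one of them with a single
# flipped bit (Hamming distance 1) and count how many bits of the recalled
# state still differ from the stored memory.
rng = np.random.default_rng(0)
N, P = 100, 20
patterns = rng.choice([-1, 1], size=(P, N)).astype(float)
W = hebbian_weights(patterns, keep_autapses=True)
probe = patterns[0].copy()
probe[0] *= -1
recalled = retrieve(W, probe)
print("bits differing from the stored pattern:", int(np.sum(recalled != patterns[0])))

Comparing keep_autapses=True and keep_autapses=False on the same probe gives a rough feel for how the diagonal terms change retrieval behavior; the paper's absorbing-neighborhood construction itself is not reproduced here.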
format Online
Article
Text
id pubmed-7515255
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7515255 2020-11-09 Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks Gosti, Giorgio Folli, Viola Leonetti, Marco Ruocco, Giancarlo Entropy (Basel) Article In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost always disallowed, in both artificial and biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows how, in an N-node Hopfield neural network with autapses, the number of stored patterns (P) is not limited to the well-known bound 0.14N, as it is for networks without autapses. More precisely, it describes how, as the number of stored patterns increases well over the 0.14N threshold, for P much greater than N, the retrieval error asymptotically approaches a value below unity. Consequently, the reduction in retrieval error allows a number of stored memories that largely exceeds what was previously considered possible. Unfortunately, soon after, new results showed that, in the thermodynamic limit, for a network with autapses in this high-storage regime, the basin of attraction of the stored memories shrinks to a single state. This means that, for each stable state associated with a stored memory, even a single-bit error in the initial pattern would lead the system to a stationary state associated with a different memory state. This limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome this limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy allows us to form what we call an absorbing neighborhood of states surrounding each stored memory. An absorbing neighborhood is a set defined by a Hamming distance around a network state; it is absorbing because, in the long-time limit, states inside it are absorbed by stable states in the set. We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing neighborhood of exponentially growing size. MDPI 2019-07-25 /pmc/articles/PMC7515255/ /pubmed/33267440 http://dx.doi.org/10.3390/e21080726 Text en © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Gosti, Giorgio
Folli, Viola
Leonetti, Marco
Ruocco, Giancarlo
Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks
title Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks
title_full Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks
title_fullStr Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks
title_full_unstemmed Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks
title_short Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks
title_sort beyond the maximum storage capacity limit in hopfield recurrent neural networks
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7515255/
https://www.ncbi.nlm.nih.gov/pubmed/33267440
http://dx.doi.org/10.3390/e21080726
work_keys_str_mv AT gostigiorgio beyondthemaximumstoragecapacitylimitinhopfieldrecurrentneuralnetworks
AT folliviola beyondthemaximumstoragecapacitylimitinhopfieldrecurrentneuralnetworks
AT leonettimarco beyondthemaximumstoragecapacitylimitinhopfieldrecurrentneuralnetworks
AT ruoccogiancarlo beyondthemaximumstoragecapacitylimitinhopfieldrecurrentneuralnetworks