
Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study

Bibliographic Details

Main Authors: Kim, Do-Hyun, Park, Jinha, Kahng, Byungnam
Format: Online Article Text
Language: English
Published: Public Library of Science 2017
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5659639/
https://www.ncbi.nlm.nih.gov/pubmed/29077721
http://dx.doi.org/10.1371/journal.pone.0184683
author Kim, Do-Hyun
Park, Jinha
Kahng, Byungnam
collection PubMed
description The Hopfield model is a pioneering neural network model with associative memory retrieval. The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of O(N), where N is the system size. Beyond that threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural networks has been developed further toward realistic neural networks using analog neurons, spiking neurons, etc. Nevertheless, those advances are based on fully connected networks, which are inconsistent with the recent experimental discovery that the number of connections of each neuron appears to be heterogeneous, following a heavy-tailed distribution. Motivated by this observation, we consider the Hopfield model on scale-free networks and obtain a pattern of associative memory retrieval different from that on the fully connected network: the storage capacity becomes tremendously enhanced, but with some error in the memory retrieval, which appears as the heterogeneity of the connections is increased. Moreover, the error rates obtained on several real neural networks are indeed similar to those on scale-free model networks.
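As a rough illustration of the setup the abstract describes — Hebbian couplings restricted to a given network topology, with zero-temperature retrieval dynamics — the following NumPy sketch stores a few random patterns and recalls one from a corrupted probe. All names, sizes, and the fully connected adjacency mask are our own illustrative choices, not code from the paper; a scale-free topology would simply replace the mask.

```python
import numpy as np

# Illustrative sketch only: Hebbian Hopfield memory on an arbitrary topology.
# Sizes and parameter choices are our own, not taken from the paper.

rng = np.random.default_rng(0)

def hebbian_weights(patterns, adjacency):
    """Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, masked by topology."""
    n = patterns.shape[1]
    J = patterns.T @ patterns / n
    np.fill_diagonal(J, 0.0)
    return J * adjacency  # zero out couplings on absent links

def retrieve(J, state, steps=20):
    """Synchronous zero-temperature dynamics: s_i <- sign(sum_j J_ij s_j)."""
    for _ in range(steps):
        new = np.sign(J @ state)
        new[new == 0] = 1  # break exact ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

n, p = 200, 5
patterns = rng.choice([-1, 1], size=(p, n))  # p random binary memories
full = np.ones((n, n)) - np.eye(n)           # fully connected adjacency mask
J = hebbian_weights(patterns, full)

probe = patterns[0].copy()
flip = rng.choice(n, size=20, replace=False)
probe[flip] *= -1                            # corrupt 10% of the bits
recalled = retrieve(J, probe)
overlap = np.mean(recalled * patterns[0])    # overlap m = 1 means perfect recall
print(f"overlap with stored pattern: {overlap:.2f}")
```

With p/N well below the classical capacity threshold, the corrupted probe relaxes back to the stored pattern; the paper's point is how this picture changes when `full` is replaced by a heterogeneous, scale-free adjacency matrix.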
format Online
Article
Text
id pubmed-5659639
institution National Center for Biotechnology Information
language English
publishDate 2017
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-5659639 2017-11-09. PLoS One, Research Article. Public Library of Science, 2017-10-27. /pmc/articles/PMC5659639/ /pubmed/29077721 http://dx.doi.org/10.1371/journal.pone.0184683 Text en © 2017 Kim et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
title Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study
topic Research Article