Unsupervised learning for robust working memory

Working memory is a core component of critical cognitive functions such as planning and decision-making. Persistent activity that lasts long after stimulus offset has been considered a neural substrate of working memory. Attractor dynamics based on network interactions can successfully reproduce such persistent activity, but they require fine-tuning of network connectivity, in particular to form the continuous attractors suggested for encoding continuous signals in working memory. Here, we investigate whether specific forms of synaptic plasticity rules can mitigate this tuning problem in two representative working memory models, namely rate-coded and location-coded persistent activity. We consider two prominent types of plasticity rules, differential plasticity, which corrects rapid activity changes, and homeostatic plasticity, which regularizes the long-term average of activity; both have been proposed to fine-tune synaptic weights in an unsupervised manner. Consistent with previous findings, differential plasticity alone was enough to recover graded-level persistent activity after perturbations of the connectivity. For location-coded memory, differential plasticity could also recover persistent activity, but the recovered pattern can be irregular across stimulus locations when learning is slow or the connectivity perturbation is large. Homeostatic plasticity, in contrast, robustly recovers smooth spatial patterns under particular types of synaptic perturbation, such as perturbations of the incoming synapses onto the entire network or onto local populations; it is not, however, effective against perturbations of the outgoing synapses from local populations. Combining it with differential plasticity instead recovers location-coded persistent activity for a broader range of perturbations, suggesting compensation between the two plasticity rules.
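The two rule types summarized above can be illustrated with a minimal rate-model sketch. The Python snippet below is not the authors' model or their exact equations; it assumes generic textbook-style forms: a differential rule that opposes rapid activity changes (weight change proportional to -(dr/dt) times the presynaptic rate) and a homeostatic rule that multiplicatively scales incoming synapses toward a target average rate, applied to a perturbed ring network. The network size, time constants, learning rates, cue, perturbation, and transfer function are all illustrative assumptions.

import numpy as np

# Minimal sketch with assumed parameters (not values from Gu & Lim 2022).
N = 100                       # number of rate units on a ring (assumed)
tau = 0.01                    # membrane time constant in seconds (assumed)
dt = 0.001                    # Euler integration step in seconds
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)

# Translation-invariant connectivity intended to support a localized activity bump.
J0, J1 = -0.5, 2.0
W_ideal = (J0 + J1 * np.cos(theta[:, None] - theta[None, :])) / N

def f(x):
    # Rectified, saturating transfer function (assumed form).
    return 10.0 * np.tanh(np.maximum(x, 0.0))

def step(r, W, I_ext):
    # One Euler step of tau * dr/dt = -r + f(W @ r + I_ext).
    drdt = (-r + f(W @ r + I_ext)) / tau
    return r + dt * drdt, drdt

def differential_update(W, r, drdt, eta=1e-7):
    # Differential plasticity (assumed form): oppose rapid activity changes,
    # dW_ij proportional to -(dr_i/dt) * r_j.
    return W - eta * np.outer(drdt, r)

def homeostatic_update(W, r_avg, r_target=5.0, eta=1e-6):
    # Homeostatic plasticity (assumed form): multiplicatively scale each unit's
    # incoming synapses so its running-average rate approaches a target.
    return W * (1.0 + eta * (r_target - r_avg))[:, None]

rng = np.random.default_rng(0)
W = W_ideal * (1.0 + 0.1 * rng.standard_normal(W_ideal.shape))  # perturbed connectivity
r = np.zeros(N)
r_avg = np.zeros(N)
cue = 2.0 * np.exp(np.cos(theta - np.pi))                       # bump-shaped cue at location pi

for t in range(5000):
    I_ext = cue if t < 200 else 0.0                             # brief cue, then delay period
    r, drdt = step(r, W, I_ext)
    r_avg += dt * (r - r_avg)                                   # running average, ~1 s time constant
    if t >= 200:                                                # plasticity acts during the delay
        W = differential_update(W, r, drdt)
        W = homeostatic_update(W, r_avg)

print("activity peak after the delay is at unit", int(np.argmax(r)))

The perturbation here is a single random multiplicative jitter of all weights, chosen only to make the example concrete; the abstract's distinction between perturbations of incoming versus outgoing synapses would correspond to jittering selected rows versus columns of W.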

Bibliographic Details
Main Authors: Gu, Jintao; Lim, Sukbin
Format: Online Article Text
Language: English
Published: Public Library of Science 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9098088/
https://www.ncbi.nlm.nih.gov/pubmed/35500033
http://dx.doi.org/10.1371/journal.pcbi.1009083
_version_ 1784706305429078016
author Gu, Jintao
Lim, Sukbin
author_facet Gu, Jintao
Lim, Sukbin
author_sort Gu, Jintao
collection PubMed
description Working memory is a core component of critical cognitive functions such as planning and decision-making. Persistent activity that lasts long after stimulus offset has been considered a neural substrate of working memory. Attractor dynamics based on network interactions can successfully reproduce such persistent activity, but they require fine-tuning of network connectivity, in particular to form the continuous attractors suggested for encoding continuous signals in working memory. Here, we investigate whether specific forms of synaptic plasticity rules can mitigate this tuning problem in two representative working memory models, namely rate-coded and location-coded persistent activity. We consider two prominent types of plasticity rules, differential plasticity, which corrects rapid activity changes, and homeostatic plasticity, which regularizes the long-term average of activity; both have been proposed to fine-tune synaptic weights in an unsupervised manner. Consistent with previous findings, differential plasticity alone was enough to recover graded-level persistent activity after perturbations of the connectivity. For location-coded memory, differential plasticity could also recover persistent activity, but the recovered pattern can be irregular across stimulus locations when learning is slow or the connectivity perturbation is large. Homeostatic plasticity, in contrast, robustly recovers smooth spatial patterns under particular types of synaptic perturbation, such as perturbations of the incoming synapses onto the entire network or onto local populations; it is not, however, effective against perturbations of the outgoing synapses from local populations. Combining it with differential plasticity instead recovers location-coded persistent activity for a broader range of perturbations, suggesting compensation between the two plasticity rules.
format Online
Article
Text
id pubmed-9098088
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-9098088 2022-05-13 Unsupervised learning for robust working memory Gu, Jintao Lim, Sukbin PLoS Comput Biol Research Article Working memory is a core component of critical cognitive functions such as planning and decision-making. Persistent activity that lasts long after the stimulus offset has been considered a neural substrate for working memory. Attractor dynamics based on network interactions can successfully reproduce such persistent activity. However, it requires a fine-tuning of network connectivity, in particular, to form continuous attractors which were suggested for encoding continuous signals in working memory. Here, we investigate whether a specific form of synaptic plasticity rules can mitigate such tuning problems in two representative working memory models, namely, rate-coded and location-coded persistent activity. We consider two prominent types of plasticity rules, differential plasticity correcting the rapid activity changes and homeostatic plasticity regularizing the long-term average of activity, both of which have been proposed to fine-tune the weights in an unsupervised manner. Consistent with the findings of previous works, differential plasticity alone was enough to recover a graded-level persistent activity after perturbations in the connectivity. For the location-coded memory, differential plasticity could also recover persistent activity. However, its pattern can be irregular for different stimulus locations under slow learning speed or large perturbation in the connectivity. On the other hand, homeostatic plasticity shows a robust recovery of smooth spatial patterns under particular types of synaptic perturbations, such as perturbations in incoming synapses onto the entire or local populations. However, homeostatic plasticity was not effective against perturbations in outgoing synapses from local populations. Instead, combining it with differential plasticity recovers location-coded persistent activity for a broader range of perturbations, suggesting compensation between two plasticity rules. Public Library of Science 2022-05-02 /pmc/articles/PMC9098088/ /pubmed/35500033 http://dx.doi.org/10.1371/journal.pcbi.1009083 Text en © 2022 Gu, Lim https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Gu, Jintao
Lim, Sukbin
Unsupervised learning for robust working memory
title Unsupervised learning for robust working memory
title_full Unsupervised learning for robust working memory
title_fullStr Unsupervised learning for robust working memory
title_full_unstemmed Unsupervised learning for robust working memory
title_short Unsupervised learning for robust working memory
title_sort unsupervised learning for robust working memory
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9098088/
https://www.ncbi.nlm.nih.gov/pubmed/35500033
http://dx.doi.org/10.1371/journal.pcbi.1009083
work_keys_str_mv AT gujintao unsupervisedlearningforrobustworkingmemory
AT limsukbin unsupervisedlearningforrobustworkingmemory