Slow manifolds within network dynamics encode working memory efficiently and robustly
Working memory is a cognitive function involving the storage and manipulation of latent information over brief intervals of time, thus making it crucial for context-dependent computation. Here, we use a top-down modeling approach to examine network-level mechanisms of working memory, an enigmatic issue and central topic of study in neuroscience. We optimize thousands of recurrent rate-based neural networks on a working memory task and then perform dynamical systems analysis on the ensuing optimized networks, wherein we find that four distinct dynamical mechanisms can emerge. In particular, we show the prevalence of a mechanism in which memories are encoded along slow stable manifolds in the network state space, leading to a phasic neuronal activation profile during memory periods. In contrast to mechanisms in which memories are directly encoded at stable attractors, these networks naturally forget stimuli over time. Despite this seeming functional disadvantage, they are more efficient in terms of how they leverage their attractor landscape and paradoxically, are considerably more robust to noise. Our results provide new hypotheses regarding how working memory function may be encoded within the dynamics of neural circuits.
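The abstract describes optimizing rate-based recurrent networks on a working memory task and then analyzing their dynamics, contrasting slow-manifold encoding (gradual forgetting) with stable-attractor encoding. As a rough illustration only, the sketch below simulates a generic rate equation on a single stimulus-then-delay trial; the weights, time constants, task timing, and readout are placeholder assumptions, not the authors' optimized networks or training procedure.

```python
# Minimal sketch (not the authors' code): a rate-based RNN of the generic form
#   tau * dx/dt = -x + W @ tanh(x) + B * u(t),
# simulated with Euler steps on a simple delayed-memory trial. The weights here
# are random placeholders; in the paper, thousands of such networks are
# optimized on the task before their dynamics are analyzed.
import numpy as np

rng = np.random.default_rng(0)
N, tau, dt = 100, 0.1, 0.01                    # units, time constant (s), step (s)
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))    # recurrent weights (placeholder)
B = rng.normal(0, 1.0, N)                      # input weights (placeholder)
w_out = rng.normal(0, 1.0 / np.sqrt(N), N)     # linear readout (placeholder)

def run_trial(stim, t_stim=0.2, t_delay=1.0, noise_sd=0.0):
    """Simulate one trial: a brief stimulus pulse followed by a memory delay."""
    T = int((t_stim + t_delay) / dt)
    x = np.zeros(N)
    readout = np.empty(T)
    for k in range(T):
        u = stim if k * dt < t_stim else 0.0            # stimulus, then delay
        noise = noise_sd * np.sqrt(dt) * rng.normal(size=N)
        x = x + (dt / tau) * (-x + W @ np.tanh(x) + B * u) + noise
        readout[k] = w_out @ np.tanh(x)
    return readout

# With untrained weights, the stimulus trace relaxes back toward baseline during
# the delay; training shapes W so that this relaxation is either slow (a slow
# manifold) or absent (a stable attractor), the two regimes contrasted above.
trace = run_trial(stim=1.0, noise_sd=0.01)
print(f"readout just after stimulus: {trace[int(0.2 / dt)]:.3f}")
print(f"readout at end of delay:     {trace[-1]:.3f}")
```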
Main Authors: | Ghazizadeh, Elham; Ching, ShiNung |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2021 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8475983/ https://www.ncbi.nlm.nih.gov/pubmed/34525089 http://dx.doi.org/10.1371/journal.pcbi.1009366 |
_version_ | 1784575506307350528 |
---|---|
author | Ghazizadeh, Elham; Ching, ShiNung
author_facet | Ghazizadeh, Elham; Ching, ShiNung
author_sort | Ghazizadeh, Elham |
collection | PubMed |
description | Working memory is a cognitive function involving the storage and manipulation of latent information over brief intervals of time, thus making it crucial for context-dependent computation. Here, we use a top-down modeling approach to examine network-level mechanisms of working memory, an enigmatic issue and central topic of study in neuroscience. We optimize thousands of recurrent rate-based neural networks on a working memory task and then perform dynamical systems analysis on the ensuing optimized networks, wherein we find that four distinct dynamical mechanisms can emerge. In particular, we show the prevalence of a mechanism in which memories are encoded along slow stable manifolds in the network state space, leading to a phasic neuronal activation profile during memory periods. In contrast to mechanisms in which memories are directly encoded at stable attractors, these networks naturally forget stimuli over time. Despite this seeming functional disadvantage, they are more efficient in terms of how they leverage their attractor landscape and paradoxically, are considerably more robust to noise. Our results provide new hypotheses regarding how working memory function may be encoded within the dynamics of neural circuits. |
format | Online Article Text |
id | pubmed-8475983 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-8475983 2021-09-28 Slow manifolds within network dynamics encode working memory efficiently and robustly Ghazizadeh, Elham; Ching, ShiNung PLoS Comput Biol Research Article Working memory is a cognitive function involving the storage and manipulation of latent information over brief intervals of time, thus making it crucial for context-dependent computation. Here, we use a top-down modeling approach to examine network-level mechanisms of working memory, an enigmatic issue and central topic of study in neuroscience. We optimize thousands of recurrent rate-based neural networks on a working memory task and then perform dynamical systems analysis on the ensuing optimized networks, wherein we find that four distinct dynamical mechanisms can emerge. In particular, we show the prevalence of a mechanism in which memories are encoded along slow stable manifolds in the network state space, leading to a phasic neuronal activation profile during memory periods. In contrast to mechanisms in which memories are directly encoded at stable attractors, these networks naturally forget stimuli over time. Despite this seeming functional disadvantage, they are more efficient in terms of how they leverage their attractor landscape and paradoxically, are considerably more robust to noise. Our results provide new hypotheses regarding how working memory function may be encoded within the dynamics of neural circuits. Public Library of Science 2021-09-15 /pmc/articles/PMC8475983/ /pubmed/34525089 http://dx.doi.org/10.1371/journal.pcbi.1009366 Text en © 2021 Ghazizadeh, Ching https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Ghazizadeh, Elham Ching, ShiNung Slow manifolds within network dynamics encode working memory efficiently and robustly |
title | Slow manifolds within network dynamics encode working memory efficiently and robustly |
title_full | Slow manifolds within network dynamics encode working memory efficiently and robustly |
title_fullStr | Slow manifolds within network dynamics encode working memory efficiently and robustly |
title_full_unstemmed | Slow manifolds within network dynamics encode working memory efficiently and robustly |
title_short | Slow manifolds within network dynamics encode working memory efficiently and robustly |
title_sort | slow manifolds within network dynamics encode working memory efficiently and robustly |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8475983/ https://www.ncbi.nlm.nih.gov/pubmed/34525089 http://dx.doi.org/10.1371/journal.pcbi.1009366 |
work_keys_str_mv | AT ghazizadehelham slowmanifoldswithinnetworkdynamicsencodeworkingmemoryefficientlyandrobustly AT chingshinung slowmanifoldswithinnetworkdynamicsencodeworkingmemoryefficientlyandrobustly |