
Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation

Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously, and typically learns best when new training is interleaved with periods of sleep for memory consolidation. Here we used a spiking neural network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it. The network could be trained to learn a complex foraging task but exhibited catastrophic forgetting when trained sequentially on different tasks. In synaptic weight space, new task training moved the synaptic weight configuration away from the manifold representing the old task, leading to forgetting. Interleaving new task training with periods of off-line reactivation, mimicking biological sleep, mitigated catastrophic forgetting by constraining the network's synaptic weight state to the previously learned manifold, while allowing the weight configuration to converge towards the intersection of the manifolds representing the old and new tasks. The study reveals a possible strategy of synaptic weight dynamics that the brain applies during sleep to prevent forgetting and optimize learning.
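
As a rough, hypothetical illustration of the training schedule the abstract describes (not the authors' spiking-network model, which is detailed in the article itself), the sketch below interleaves new-task updates with noise-driven "sleep" phases; the toy network, data, and plasticity rules are placeholder assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def train_step(w, x, target, lr=0.01):
        # One supervised-style update on the current (new) task.
        pred = np.tanh(x @ w)
        grad = x.T @ ((pred - target) * (1.0 - pred ** 2))
        return w - lr * grad

    def sleep_phase(w, n_steps=50, lr=0.001):
        # Off-line reactivation: spontaneous (noise-driven) activity paired with
        # a Hebbian-style unsupervised update, loosely mimicking replay in sleep.
        for _ in range(n_steps):
            x = rng.normal(size=(1, w.shape[0]))   # spontaneous input
            y = np.tanh(x @ w)                      # evoked activity
            w = w + lr * (x.T @ y)                  # strengthen co-active pairs
            w = w * 0.999                           # mild synaptic downscaling
        return w

    # Toy stand-ins for the "new task" (random data, for illustration only).
    w = rng.normal(scale=0.1, size=(8, 4))
    new_x = rng.normal(size=(32, 8))
    new_y = rng.normal(size=(32, 4))

    # Instead of training on the new task in one uninterrupted block, interleave
    # each block of new-task training with a sleep-like reactivation phase.
    for epoch in range(20):
        w = train_step(w, new_x, new_y)
        w = sleep_phase(w)

In the authors' model, sleep is implemented as off-line reactivation in a spiking network; here it is reduced to a generic Hebbian toy update purely to show the interleaved schedule.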


Bibliographic Details
Main Authors: Golden, Ryan; Delanois, Jean Erik; Sanda, Pavel; Bazhenov, Maxim
Format: Online Article Text
Language: English
Published: Public Library of Science, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9674146/
https://www.ncbi.nlm.nih.gov/pubmed/36399437
http://dx.doi.org/10.1371/journal.pcbi.1010628
author Golden, Ryan
Delanois, Jean Erik
Sanda, Pavel
Bazhenov, Maxim
collection PubMed
description Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously, and typically learns best when new training is interleaved with periods of sleep for memory consolidation. Here we used a spiking neural network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it. The network could be trained to learn a complex foraging task but exhibited catastrophic forgetting when trained sequentially on different tasks. In synaptic weight space, new task training moved the synaptic weight configuration away from the manifold representing the old task, leading to forgetting. Interleaving new task training with periods of off-line reactivation, mimicking biological sleep, mitigated catastrophic forgetting by constraining the network's synaptic weight state to the previously learned manifold, while allowing the weight configuration to converge towards the intersection of the manifolds representing the old and new tasks. The study reveals a possible strategy of synaptic weight dynamics that the brain applies during sleep to prevent forgetting and optimize learning.
format Online
Article
Text
id pubmed-9674146
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-9674146 2022-11-19 PLoS Comput Biol Research Article Public Library of Science 2022-11-18 /pmc/articles/PMC9674146/ /pubmed/36399437 http://dx.doi.org/10.1371/journal.pcbi.1010628 Text en © 2022 Golden et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
title Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9674146/
https://www.ncbi.nlm.nih.gov/pubmed/36399437
http://dx.doi.org/10.1371/journal.pcbi.1010628