Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation
Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously, and typically learns best when new training is interleaved with periods of sleep for memory consolidation. Here we used...
Main Authors: Golden, Ryan; Delanois, Jean Erik; Sanda, Pavel; Bazhenov, Maxim
Format: Online Article Text
Language: English
Published: Public Library of Science, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9674146/ · https://www.ncbi.nlm.nih.gov/pubmed/36399437 · http://dx.doi.org/10.1371/journal.pcbi.1010628
Similar Items
- Can sleep protect memories from catastrophic forgetting?
  by: González, Oscar C, et al.
  Published: (2020)
- Sleep-like unsupervised replay reduces catastrophic forgetting in artificial neural networks
  by: Tadros, Timothy, et al.
  Published: (2022)
- Multi-layer network utilizing rewarded spike time dependent plasticity to learn a foraging task
  by: Sanda, Pavel, et al.
  Published: (2017)
- A brain-inspired algorithm that mitigates catastrophic forgetting of artificial and spiking neural networks with low computational cost
  by: Zhang, Tielin, et al.
  Published: (2023)
- Structural Synaptic Plasticity Has High Memory Capacity and Can Explain Graded Amnesia, Catastrophic Forgetting, and the Spacing Effect
  by: Knoblauch, Andreas, et al.
  Published: (2014)