Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation

Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously, and typically learns best when new training is interleaved with periods of sleep for memory consolidation. Here we used...
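The forgetting effect described above is easy to reproduce outside of spiking networks. The following minimal NumPy sketch (an editor's illustration, not the paper's model; the tasks, network size, and learning rate are all arbitrary) trains a small network on one regression task and then sequentially on a second, with no interleaving or replay; the first task's loss rises sharply after the second is learned.

    # Illustrative sketch of catastrophic forgetting (not from the paper):
    # train a small network on task A, then on task B, and observe that
    # task-A performance degrades once B overwrites the shared weights.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_task(seed):
        # Each hypothetical "task" is a random linear map to regress.
        r = np.random.default_rng(seed)
        W = r.normal(size=(4, 2))
        X = r.normal(size=(256, 4))
        return X, X @ W

    def mse(params, X, Y):
        W1, W2 = params
        return np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2)

    def train(params, X, Y, steps=2000, lr=0.05):
        W1, W2 = params
        for _ in range(steps):
            H = np.tanh(X @ W1)
            err = H @ W2 - Y                  # residual, shape (N, 2)
            gW2 = H.T @ err / len(X)
            gH = err @ W2.T * (1 - H ** 2)    # backprop through tanh
            gW1 = X.T @ gH / len(X)
            W1 -= lr * gW1
            W2 -= lr * gW2
        return W1, W2

    params = (rng.normal(scale=0.5, size=(4, 16)),
              rng.normal(scale=0.5, size=(16, 2)))

    XA, YA = make_task(1)
    XB, YB = make_task(2)

    params = train(params, XA, YA)
    print(f"task A loss after training A: {mse(params, XA, YA):.4f}")
    params = train(params, XB, YB)  # sequential training, no replay
    print(f"task A loss after training B: {mse(params, XA, YA):.4f}")

The second printed loss is typically orders of magnitude larger than the first: training on task B overwrites the weights that encoded task A, which is the failure mode the paper's sleep-like replay phase is designed to prevent.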

Bibliographic Details
Main Authors: Golden, Ryan; Delanois, Jean Erik; Sanda, Pavel; Bazhenov, Maxim
Format: Online Article Text
Language: English
Published: Public Library of Science, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9674146/
https://www.ncbi.nlm.nih.gov/pubmed/36399437
http://dx.doi.org/10.1371/journal.pcbi.1010628