Sleep-like unsupervised replay reduces catastrophic forgetting in artificial neural networks
Artificial neural networks are known to suffer from catastrophic forgetting: when learning multiple tasks sequentially, they perform well on the most recent task at the expense of previously learned tasks. In the brain, sleep is known to play an important role in incremental learning by replaying re...
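The forgetting effect named in the abstract is easy to reproduce. Below is a minimal sketch, not the authors' code or model: a single linear classifier trained with plain gradient descent on two invented, conflicting synthetic tasks in sequence, with no replay. All data, task definitions, and hyperparameters here are assumptions for illustration; after training on task B, accuracy on task A typically falls back toward chance.

```python
# Minimal sketch of catastrophic forgetting (illustrative only, not the paper's method).
import numpy as np

rng = np.random.default_rng(0)

def make_task(center):
    # Two Gaussian blobs (labels 0 and 1); `center` shifts where the boundary must lie.
    x0 = rng.normal(center - 2.0, 1.0, size=(200, 2))
    x1 = rng.normal(center + 2.0, 1.0, size=(200, 2))
    X = np.vstack([x0, x1])
    y = np.r_[np.zeros(200), np.ones(200)]
    return X, y

def train(w, b, X, y, lr=0.1, epochs=200):
    # Logistic regression trained by gradient descent on one task at a time.
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid output
        grad_w = X.T @ (p - y) / len(y)          # gradient of the logistic loss
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def accuracy(w, b, X, y):
    return np.mean(((X @ w + b) > 0) == y)

task_A = make_task(center=-3.0)   # boundary needed near (-3, -3)
task_B = make_task(center=+3.0)   # boundary needed near (+3, +3)

w, b = np.zeros(2), 0.0
w, b = train(w, b, *task_A)
print("After task A:  acc A =", accuracy(w, b, *task_A))

# Sequential training on task B with no replay overwrites the task-A solution.
w, b = train(w, b, *task_B)
print("After task B:  acc A =", accuracy(w, b, *task_A),
      " acc B =", accuracy(w, b, *task_B))
```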
Main Authors: Tadros, Timothy; Krishnan, Giri P.; Ramyaa, Ramyaa; Bazhenov, Maxim
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2022
Online Access:
- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9755223/
- https://www.ncbi.nlm.nih.gov/pubmed/36522325
- http://dx.doi.org/10.1038/s41467-022-34938-7
Similar Items
- Can sleep protect memories from catastrophic forgetting?
  by: González, Oscar C., et al.
  Published: (2020)
- Mechanisms of hippocampal sequence replay
  by: Malerba, Paola, et al.
  Published: (2015)
- Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation
  by: Golden, Ryan, et al.
  Published: (2022)
- Hippocampal replay and cortical slow oscillations: a computational study
  by: Malerba, Paola, et al.
  Published: (2014)
- Phenotyping Women Based on Dietary Macronutrients, Physical Activity, and Body Weight Using Machine Learning Tools
  by: Ramyaa, Ramyaa, et al.
  Published: (2019)