Modelling continual learning in humans with Hebbian context gating and exponentially decaying task signals
Humans can learn several tasks in succession with minimal mutual interference but perform more poorly when trained on multiple tasks at once. The opposite is true for standard deep neural networks. Here, we propose novel computational constraints for artificial neural networks, inspired by earlier w...
Main Authors: Flesch, Timo; Nagy, David G.; Saxe, Andrew; Summerfield, Christopher
Format: Online Article Text
Language: English
Published: Public Library of Science, 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9851563/ https://www.ncbi.nlm.nih.gov/pubmed/36656823 http://dx.doi.org/10.1371/journal.pcbi.1010808
Similar Items
- Orthogonal representations for robust context-dependent task performance in brains and neural networks
  by: Flesch, Timo, et al.
  Published: (2022)
- Comparing continual task learning in minds and machines
  by: Flesch, Timo, et al.
  Published: (2018)
- Multi-context blind source separation by error-gated Hebbian rule
  by: Isomura, Takuya, et al.
  Published: (2019)
- Hebbian learning of context in recurrent neural networks
  by: Brunel, N.
  Published: (1995)