Diffusion-based neuromodulation can eliminate catastrophic forgetting in simple neural networks
A long-term goal of AI is to produce agents that can learn a diversity of skills throughout their lifetimes and continuously improve those skills via experience. A longstanding obstacle towards that goal is catastrophic forgetting, which is when learning new information erases previously learned information…
Main Authors: Velez, Roby; Clune, Jeff
Format: Online Article Text
Language: English
Published: Public Library of Science, 2017
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5690421/
https://www.ncbi.nlm.nih.gov/pubmed/29145413
http://dx.doi.org/10.1371/journal.pone.0187736
Similar Items
- Can sleep protect memories from catastrophic forgetting?
  by: González, Oscar C, et al.
  Published: (2020)
- Sleep-like unsupervised replay reduces catastrophic forgetting in artificial neural networks
  by: Tadros, Timothy, et al.
  Published: (2022)
- Model architecture can transform catastrophic forgetting into positive transfer
  by: Ruiz-Garcia, Miguel
  Published: (2022)
- Neural Modularity Helps Organisms Evolve to Learn New Skills without Forgetting Old Skills
  by: Ellefsen, Kai Olav, et al.
  Published: (2015)
- Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation
  by: Golden, Ryan, et al.
  Published: (2022)