Random synaptic feedback weights support error backpropagation for deep learning
The brain processes information through multiple layers of neurons. This deep architecture is representationally powerful, but complicates learning because it is difficult to identify the responsible neurons when a mistake is made. In machine learning, the backpropagation algorithm assigns blame by...
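The paper's core proposal, named in the title, is that exact "weight transport" is unnecessary: sending the output error backward through a fixed random matrix B, instead of the transpose of the forward weights, still drives useful learning. A minimal sketch of that idea (my own illustration, not the authors' code; network sizes, learning rate, and the toy regression task are assumptions) for a one-hidden-layer network:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 4, 16, 1
W1 = rng.normal(0, 0.5, (n_hid, n_in))   # forward weights, layer 1
W2 = rng.normal(0, 0.5, (n_out, n_hid))  # forward weights, layer 2
B = rng.normal(0, 0.5, (n_hid, n_out))   # fixed random feedback weights (never trained)

X = rng.normal(size=(64, n_in))
T = X @ rng.normal(size=(n_in, n_out))   # linear teacher provides targets

def mse():
    return float(np.mean((np.tanh(X @ W1.T) @ W2.T - T) ** 2))

loss_before = mse()
lr = 0.02
for _ in range(500):
    H = np.tanh(X @ W1.T)                # hidden activity
    Y = H @ W2.T                         # network output
    E = Y - T                            # output error
    # Key difference from backprop: the error reaches the hidden layer
    # through the random matrix B rather than through W2.T.
    dH = (E @ B.T) * (1 - H ** 2)
    W2 -= lr * E.T @ H / len(X)
    W1 -= lr * dH.T @ X / len(X)
loss_after = mse()
```

Because the forward weights gradually align with the random feedback pathway, the loss still falls even though the hidden-layer error signal is not the true gradient.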
Main Authors: Lillicrap, Timothy P.; Cownden, Daniel; Tweed, Douglas B.; Akerman, Colin J.
Format: Online Article Text
Language: English
Published: Nature Publishing Group, 2016
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5105169/
https://www.ncbi.nlm.nih.gov/pubmed/27824044
http://dx.doi.org/10.1038/ncomms13276
Similar Items

- Somato-dendritic Synaptic Plasticity and Error-backpropagation in Active Dendrites
  by: Schiess, Mathieu, et al.
  Published: (2016)
- An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity
  by: Whittington, James C. R., et al.
  Published: (2017)
- A Circuit-Based Neural Network with Hybrid Learning of Backpropagation and Random Weight Change Algorithms
  by: Yang, Changju, et al.
  Published: (2016)
- Learning sensitivity derivative by implicit supervision
  by: Abdelghani, Mohamed N, et al.
  Published: (2007)
- Fractional-Order Deep Backpropagation Neural Network
  by: Bao, Chunhui, et al.
  Published: (2018)