Layer-Skipping Connections Improve the Effectiveness of Equilibrium Propagation on Layered Networks

Bibliographic Details
Main Authors: Gammell, Jimmy; Buckley, Sonia; Nam, Sae Woo; McCaughan, Adam N.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8165608/
https://www.ncbi.nlm.nih.gov/pubmed/34079446
http://dx.doi.org/10.3389/fncom.2021.627357
Description
Summary: Equilibrium propagation is a learning framework that marks a step forward in the search for a biologically-plausible implementation of deep learning, and could be implemented efficiently in neuromorphic hardware. Previous applications of this framework to layered networks encountered a vanishing gradient problem that has not yet been solved in a simple, biologically-plausible way. In this paper, we demonstrate that the vanishing gradient problem can be mitigated by replacing some of a layered network's connections with random layer-skipping connections in a manner inspired by small-world networks. This approach would be convenient to implement in neuromorphic hardware, and is biologically-plausible.
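To make the architectural idea in the summary concrete, the Python sketch below illustrates one way to rewire a fraction of a layered network's feedforward connections into random layer-skipping shortcuts, in the spirit of small-world (Watts-Strogatz-style) rewiring. This is a minimal illustration, not the authors' code: the layer sizes, the rewiring probability p, and all function names here are assumptions for demonstration only.

    import random

    def build_layered_edges(layer_sizes):
        """Build a purely layered topology: all-to-all connections
        between each pair of adjacent layers."""
        layers = []
        node = 0
        for size in layer_sizes:
            layers.append(list(range(node, node + size)))
            node += size
        edges = [(i, j)
                 for l in range(len(layers) - 1)
                 for i in layers[l]
                 for j in layers[l + 1]]
        return edges, layers

    def layer_of(node, layers):
        """Return the index of the layer containing the given node."""
        return next(l for l, ids in enumerate(layers) if node in ids)

    def rewire_layer_skipping(edges, layers, p, rng=random):
        """With probability p, replace an edge's target with a random node
        at least two layers deeper, creating a layer-skipping shortcut.
        Rewiring (rather than adding edges) keeps the connection count fixed."""
        rewired = []
        for (i, j) in edges:
            deeper = [n for l in range(layer_of(i, layers) + 2, len(layers))
                      for n in layers[l]]
            if deeper and rng.random() < p:
                rewired.append((i, rng.choice(deeper)))
            else:
                rewired.append((i, j))
        return rewired

    if __name__ == "__main__":
        edges, layers = build_layered_edges([4, 8, 8, 2])
        shortcuts = rewire_layer_skipping(edges, layers, p=0.1)
        n_skip = sum(1 for i, j in shortcuts
                     if layer_of(j, layers) - layer_of(i, layers) > 1)
        print(f"{n_skip} of {len(shortcuts)} connections skip at least one layer")

Rewiring existing connections instead of adding new ones keeps the total connection count unchanged, which matches the summary's description of replacing some of the network's connections and could be one reason the approach is convenient for neuromorphic hardware, where wiring budgets are fixed.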