Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation

Bibliographic Details

Main Authors: Scellier, Benjamin; Bengio, Yoshua
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2017
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5415673/
https://www.ncbi.nlm.nih.gov/pubmed/28522969
http://dx.doi.org/10.3389/fncom.2017.00024
_version_ 1783233568660520960
author Scellier, Benjamin
Bengio, Yoshua
author_facet Scellier, Benjamin
Bengio, Yoshua
author_sort Scellier, Benjamin
collection PubMed
description We introduce Equilibrium Propagation, a learning framework for energy-based models. It involves only one kind of neural computation, performed in both the first phase (when the prediction is made) and the second phase of training (after the target or prediction error is revealed). Although this algorithm computes the gradient of an objective function just like Backpropagation, it does not need a special computation or circuit for the second phase, where errors are implicitly propagated. Equilibrium Propagation shares similarities with Contrastive Hebbian Learning and Contrastive Divergence while solving the theoretical issues of both algorithms: our algorithm computes the gradient of a well-defined objective function. Because the objective function is defined in terms of local perturbations, the second phase of Equilibrium Propagation corresponds to only nudging the prediction (fixed point or stationary distribution) toward a configuration that reduces prediction error. In the case of a recurrent multi-layer supervised network, the output units are slightly nudged toward their target in the second phase, and the perturbation introduced at the output layer propagates backward in the hidden layers. We show that the signal “back-propagated” during this second phase corresponds to the propagation of error derivatives and encodes the gradient of the objective function, when the synaptic update corresponds to a standard form of spike-timing dependent plasticity. This work makes it more plausible that a mechanism similar to Backpropagation could be implemented by brains, since leaky integrator neural computation performs both inference and error back-propagation in our model. The only local difference between the two phases is whether synaptic changes are allowed or not. We also show experimentally that multi-layer recurrently connected networks with 1, 2, and 3 hidden layers can be trained by Equilibrium Propagation on the permutation-invariant MNIST task.
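The two-phase procedure the abstract describes (a free relaxation to a fixed point, then a weakly nudged relaxation, with a contrastive local weight update) can be sketched on a toy network. This is a minimal illustrative sketch, not the paper's exact setup: the layer sizes, hard-sigmoid activation, step sizes, and hyperparameters below are all assumptions chosen for a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layered net: input -> hidden -> output (sizes are arbitrary choices)
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0.0, 0.1, (n_in, n_hid))
W2 = rng.normal(0.0, 0.1, (n_hid, n_out))

def rho(s):                        # hard-sigmoid activation
    return np.clip(s, 0.0, 1.0)

def rho_prime(s):                  # its (sub)derivative; closed at the ends
    return ((s >= 0.0) & (s <= 1.0)).astype(float)  # so zero states still move

def state_grads(x, h, y):
    # Gradients of a Hopfield-style energy w.r.t. the free states h and y
    dh = h - rho_prime(h) * (rho(x) @ W1 + rho(y) @ W2.T)
    dy = y - rho_prime(y) * (rho(h) @ W2)
    return dh, dy

def relax(x, h, y, beta=0.0, target=None, steps=40, dt=0.5):
    # Gradient descent on the energy over the states; with beta > 0 the
    # cost 0.5 * ||y - target||^2 nudges the outputs toward the target
    for _ in range(steps):
        dh, dy = state_grads(x, h, y)
        if beta:
            dy = dy + beta * (y - target)
        h, y = h - dt * dh, y - dt * dy
    return h, y

def eqprop_step(x, target, beta=0.5, lr=0.2):
    global W1, W2
    h0, y0 = relax(x, np.zeros(n_hid), np.zeros(n_out))   # free phase
    hb, yb = relax(x, h0, y0, beta=beta, target=target)   # nudged phase
    # Contrastive, local update: the difference of pre/post activity products
    # at the two equilibria approximates the loss gradient for small beta
    W1 += (lr / beta) * (np.outer(rho(x), rho(hb)) - np.outer(rho(x), rho(h0)))
    W2 += (lr / beta) * (np.outer(rho(hb), rho(yb)) - np.outer(rho(h0), rho(y0)))
    return 0.5 * np.sum((y0 - target) ** 2)   # free-phase prediction error
```

Note how both phases run the *same* relaxation dynamics; the only difference is whether the output cost is switched on (β > 0), which is the property the abstract highlights as making the scheme biologically plausible.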
format Online
Article
Text
id pubmed-5415673
institution National Center for Biotechnology Information
language English
publishDate 2017
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-5415673 2017-05-18 Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation Scellier, Benjamin Bengio, Yoshua Front Comput Neurosci Neuroscience Frontiers Media S.A. 2017-05-04 /pmc/articles/PMC5415673/ /pubmed/28522969 http://dx.doi.org/10.3389/fncom.2017.00024 Text en Copyright © 2017 Scellier and Bengio. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Scellier, Benjamin
Bengio, Yoshua
Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation
title Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation
title_full Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation
title_fullStr Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation
title_full_unstemmed Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation
title_short Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation
title_sort equilibrium propagation: bridging the gap between energy-based models and backpropagation
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5415673/
https://www.ncbi.nlm.nih.gov/pubmed/28522969
http://dx.doi.org/10.3389/fncom.2017.00024
work_keys_str_mv AT scellierbenjamin equilibriumpropagationbridgingthegapbetweenenergybasedmodelsandbackpropagation
AT bengioyoshua equilibriumpropagationbridgingthegapbetweenenergybasedmodelsandbackpropagation