Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines
Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex. Here, we introduce Synaptic Sampling Machines (S2Ms), a class of neural network models that uses synaptic stochasticity as a means to Monte Carlo sampling and unsupervised learning.
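The mechanism described in this abstract can be illustrated with a minimal sketch: deterministic threshold units, as in a Hopfield network, whose input synapses each transmit with some probability, so that all randomness in the dynamics comes from the synapses themselves. The network size, weights, and transmission probability below are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy symmetric network in the Hopfield/Boltzmann style (assumed sizes/values).
n = 8
W = rng.normal(0, 1, (n, n))
W = (W + W.T) / 2           # symmetric couplings
np.fill_diagonal(W, 0.0)    # no self-connections
b = rng.normal(0, 0.5, n)   # biases
p_transmit = 0.5            # per-synapse transmission probability (assumed)

def stochastic_step(s):
    """One asynchronous sweep where each synapse transmits with prob p_transmit."""
    s = s.copy()
    for i in rng.permutation(n):
        mask = rng.random(n) < p_transmit   # random mask over connections
        u = (mask * W[i]) @ s + b[i]        # masked synaptic input
        s[i] = 1.0 if u > 0 else 0.0        # deterministic unit: randomness
    return s                                #   is induced only by the synapses

s = (rng.random(n) < 0.5).astype(float)
for _ in range(100):
    s = stochastic_step(s)  # the visited states form a Monte Carlo sample
```

Note that the units themselves apply a hard threshold; the random mask plays the role that thermal noise plays in a Boltzmann machine, which is the sense in which the abstract calls this a stochastic counterpart of Hopfield networks.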
Main Authors: | Neftci, Emre O.; Pedroni, Bruno U.; Joshi, Siddharth; Al-Shedivat, Maruan; Cauwenberghs, Gert |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2016 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4925698/ https://www.ncbi.nlm.nih.gov/pubmed/27445650 http://dx.doi.org/10.3389/fnins.2016.00241 |
Field | Value |
---|---|
_version_ | 1782439983086305280 |
author | Neftci, Emre O. Pedroni, Bruno U. Joshi, Siddharth Al-Shedivat, Maruan Cauwenberghs, Gert |
author_facet | Neftci, Emre O. Pedroni, Bruno U. Joshi, Siddharth Al-Shedivat, Maruan Cauwenberghs, Gert |
author_sort | Neftci, Emre O. |
collection | PubMed |
description | Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex. Here, we introduce Synaptic Sampling Machines (S2Ms), a class of neural network models that uses synaptic stochasticity as a means to Monte Carlo sampling and unsupervised learning. Similar to the original formulation of Boltzmann machines, these models can be viewed as a stochastic counterpart of Hopfield networks, but where stochasticity is induced by a random mask over the connections. Synaptic stochasticity plays the dual role of an efficient mechanism for sampling, and a regularizer during learning akin to DropConnect. A local synaptic plasticity rule implementing an event-driven form of contrastive divergence enables the learning of generative models in an on-line fashion. S2Ms perform equally well using discrete-timed artificial units (as in Hopfield networks) or continuous-timed leaky integrate and fire neurons. The learned representations are remarkably sparse and robust to reductions in bit precision and synapse pruning: removal of more than 75% of the weakest connections followed by cursory re-learning causes a negligible performance loss on benchmark classification tasks. The spiking neuron-based S2Ms outperform existing spike-based unsupervised learners, while potentially offering substantial advantages in terms of power and complexity, and are thus promising models for on-line learning in brain-inspired hardware. |
format | Online Article Text |
id | pubmed-4925698 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2016 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-4925698 2016-07-21 Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines Neftci, Emre O. Pedroni, Bruno U. Joshi, Siddharth Al-Shedivat, Maruan Cauwenberghs, Gert Front Neurosci Neuroscience Frontiers Media S.A. 2016-06-29 /pmc/articles/PMC4925698/ /pubmed/27445650 http://dx.doi.org/10.3389/fnins.2016.00241 Text en Copyright © 2016 Neftci, Pedroni, Joshi, Al-Shedivat and Cauwenberghs.
http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Neftci, Emre O. Pedroni, Bruno U. Joshi, Siddharth Al-Shedivat, Maruan Cauwenberghs, Gert Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines |
title | Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines |
title_full | Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines |
title_fullStr | Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines |
title_full_unstemmed | Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines |
title_short | Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines |
title_sort | stochastic synapses enable efficient brain-inspired learning machines |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4925698/ https://www.ncbi.nlm.nih.gov/pubmed/27445650 http://dx.doi.org/10.3389/fnins.2016.00241 |
work_keys_str_mv | AT neftciemreo stochasticsynapsesenableefficientbraininspiredlearningmachines AT pedronibrunou stochasticsynapsesenableefficientbraininspiredlearningmachines AT joshisiddharth stochasticsynapsesenableefficientbraininspiredlearningmachines AT alshedivatmaruan stochasticsynapsesenableefficientbraininspiredlearningmachines AT cauwenberghsgert stochasticsynapsesenableefficientbraininspiredlearningmachines |