Training Deep Spiking Neural Networks Using Backpropagation
Deep spiking neural networks (SNNs) hold the potential for improving the latency and energy efficiency of deep neural networks through data-driven event-based computation. However, training such networks is difficult due to the non-differentiable nature of spike events. In this paper, we introduce a novel technique, which treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise. This enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks, but works directly on spike signals and membrane potentials. Compared with previous methods relying on indirect training and conversion, our technique has the potential to capture the statistics of spikes more precisely. We evaluate the proposed framework on artificially generated events from the original MNIST handwritten digit benchmark, and also on the N-MNIST benchmark recorded with an event-based dynamic vision sensor, in which the proposed method reduces the error rate by a factor of more than three compared to the best previous SNN, and also achieves a higher accuracy than a conventional convolutional neural network (CNN) trained and tested on the same data. We demonstrate in the context of the MNIST task that thanks to their event-driven operation, deep SNNs (both fully connected and convolutional) trained with our method achieve accuracy equivalent with conventional neural networks. In the N-MNIST example, equivalent accuracy is achieved with about five times fewer computational operations.
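The core idea named in the abstract (membrane potentials treated as differentiable signals, spike discontinuities treated as noise) can be illustrated with a surrogate-gradient spiking layer. The sketch below is illustrative only and not the authors' exact derivation: it assumes PyTorch, a simple leaky integrate-and-fire (LIF) update, and a hypothetical `SpikeFn` whose backward pass lets gradients flow through the threshold as if the output were the membrane potential itself; all names, constants, and the gradient window are this sketch's assumptions, not values from the paper.

```python
# Illustrative sketch only -- NOT the exact method of Lee, Delbruck &
# Pfeiffer (2016). It shows the general idea from the abstract: the
# spike threshold is non-differentiable, so the backward pass treats
# the discontinuity as noise and propagates gradients through the
# membrane potential instead (a surrogate / straight-through gradient).
import torch

class SpikeFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v, threshold):
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        return (v >= threshold).float()        # binary spike event

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass the gradient through near the threshold, as if the output
        # were the (differentiable) membrane potential; the window width
        # of 0.5 is an arbitrary choice for this sketch.
        gate = ((v - ctx.threshold).abs() < 0.5).float()
        return grad_out * gate, None

def lif_step(v, x, w, decay=0.9, threshold=1.0):
    """One leaky integrate-and-fire step: leak, integrate, spike, reset."""
    v = decay * v + x @ w                      # membrane potential update
    s = SpikeFn.apply(v, threshold)            # event-based (0/1) output
    return v - s * threshold, s                # soft reset after a spike

# Toy usage: unroll 20 time steps and backprop a firing-rate loss.
torch.manual_seed(0)
w = torch.nn.Parameter(0.5 * torch.randn(10, 5))
v, rate = torch.zeros(1, 5), torch.zeros(1, 5)
for _ in range(20):
    x = (torch.rand(1, 10) < 0.3).float()      # Poisson-like input spikes
    v, s = lif_step(v, x, w)
    rate = rate + s
target = torch.tensor([[0.0, 1.0, 0.0, 0.0, 0.0]])
loss = ((rate / 20 - target) ** 2).mean()
loss.backward()                                # gradients reach w through spikes
print("grad norm:", w.grad.norm().item())
```

Gating the surrogate gradient to a window around the threshold is just one common choice; the paper itself derives its gradients directly from the LIF membrane dynamics rather than from a generic straight-through estimator.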
Main Authors: | Lee, Jun Haeng; Delbruck, Tobi; Pfeiffer, Michael |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2016 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5099523/ https://www.ncbi.nlm.nih.gov/pubmed/27877107 http://dx.doi.org/10.3389/fnins.2016.00508 |
_version_ | 1782465980751937536 |
---|---|
author | Lee, Jun Haeng; Delbruck, Tobi; Pfeiffer, Michael |
author_facet | Lee, Jun Haeng; Delbruck, Tobi; Pfeiffer, Michael |
author_sort | Lee, Jun Haeng |
collection | PubMed |
description | Deep spiking neural networks (SNNs) hold the potential for improving the latency and energy efficiency of deep neural networks through data-driven event-based computation. However, training such networks is difficult due to the non-differentiable nature of spike events. In this paper, we introduce a novel technique, which treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise. This enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks, but works directly on spike signals and membrane potentials. Compared with previous methods relying on indirect training and conversion, our technique has the potential to capture the statistics of spikes more precisely. We evaluate the proposed framework on artificially generated events from the original MNIST handwritten digit benchmark, and also on the N-MNIST benchmark recorded with an event-based dynamic vision sensor, in which the proposed method reduces the error rate by a factor of more than three compared to the best previous SNN, and also achieves a higher accuracy than a conventional convolutional neural network (CNN) trained and tested on the same data. We demonstrate in the context of the MNIST task that thanks to their event-driven operation, deep SNNs (both fully connected and convolutional) trained with our method achieve accuracy equivalent with conventional neural networks. In the N-MNIST example, equivalent accuracy is achieved with about five times fewer computational operations. |
format | Online Article Text |
id | pubmed-5099523 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2016 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-50995232016-11-22 Training Deep Spiking Neural Networks Using Backpropagation Lee, Jun Haeng Delbruck, Tobi Pfeiffer, Michael Front Neurosci Neuroscience Deep spiking neural networks (SNNs) hold the potential for improving the latency and energy efficiency of deep neural networks through data-driven event-based computation. However, training such networks is difficult due to the non-differentiable nature of spike events. In this paper, we introduce a novel technique, which treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise. This enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks, but works directly on spike signals and membrane potentials. Compared with previous methods relying on indirect training and conversion, our technique has the potential to capture the statistics of spikes more precisely. We evaluate the proposed framework on artificially generated events from the original MNIST handwritten digit benchmark, and also on the N-MNIST benchmark recorded with an event-based dynamic vision sensor, in which the proposed method reduces the error rate by a factor of more than three compared to the best previous SNN, and also achieves a higher accuracy than a conventional convolutional neural network (CNN) trained and tested on the same data. We demonstrate in the context of the MNIST task that thanks to their event-driven operation, deep SNNs (both fully connected and convolutional) trained with our method achieve accuracy equivalent with conventional neural networks. In the N-MNIST example, equivalent accuracy is achieved with about five times fewer computational operations. Frontiers Media S.A. 2016-11-08 /pmc/articles/PMC5099523/ /pubmed/27877107 http://dx.doi.org/10.3389/fnins.2016.00508 Text en Copyright © 2016 Lee, Delbruck and Pfeiffer. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Lee, Jun Haeng Delbruck, Tobi Pfeiffer, Michael Training Deep Spiking Neural Networks Using Backpropagation |
title | Training Deep Spiking Neural Networks Using Backpropagation |
title_full | Training Deep Spiking Neural Networks Using Backpropagation |
title_fullStr | Training Deep Spiking Neural Networks Using Backpropagation |
title_full_unstemmed | Training Deep Spiking Neural Networks Using Backpropagation |
title_short | Training Deep Spiking Neural Networks Using Backpropagation |
title_sort | training deep spiking neural networks using backpropagation |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5099523/ https://www.ncbi.nlm.nih.gov/pubmed/27877107 http://dx.doi.org/10.3389/fnins.2016.00508 |
work_keys_str_mv | AT leejunhaeng trainingdeepspikingneuralnetworksusingbackpropagation AT delbrucktobi trainingdeepspikingneuralnetworksusingbackpropagation AT pfeiffermichael trainingdeepspikingneuralnetworksusingbackpropagation |