Event-based backpropagation can compute exact gradients for spiking neural networks

Spiking neural networks combine analog computation with event-based communication using discrete spikes. While the impressive advances of deep learning are enabled by training non-spiking artificial neural networks using the backpropagation algorithm, applying this algorithm to spiking networks was...

Full description

Bibliographic Details
Main Authors: Wunderlich, Timo C., Pehle, Christian
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8213775/
https://www.ncbi.nlm.nih.gov/pubmed/34145314
http://dx.doi.org/10.1038/s41598-021-91786-z
_version_ 1783709924931403776
author Wunderlich, Timo C.
Pehle, Christian
author_facet Wunderlich, Timo C.
Pehle, Christian
author_sort Wunderlich, Timo C.
collection PubMed
description Spiking neural networks combine analog computation with event-based communication using discrete spikes. While the impressive advances of deep learning are enabled by training non-spiking artificial neural networks using the backpropagation algorithm, applying this algorithm to spiking networks was previously hindered by the existence of discrete spike events and discontinuities. For the first time, this work derives the backpropagation algorithm for a continuous-time spiking neural network and a general loss function by applying the adjoint method together with the proper partial derivative jumps, allowing for backpropagation through discrete spike events without approximations. This algorithm, EventProp, backpropagates errors at spike times in order to compute the exact gradient in an event-based, temporally and spatially sparse fashion. We use gradients computed via EventProp to train networks on the Yin-Yang and MNIST datasets using either a spike-time or voltage-based loss function and report competitive performance. Our work supports the rigorous study of gradient-based learning algorithms in spiking neural networks and provides insights toward their implementation in novel brain-inspired hardware.
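The core mechanism described in the abstract, differentiating exactly through a discrete spike event rather than smoothing it with a surrogate, can be illustrated in miniature. The following Python sketch is a toy under stated assumptions, not the authors' EventProp implementation: a single leaky integrate-and-fire neuron with hypothetical parameters (TAU_M, TAU_S, THETA, weight w) receives one input spike at t = 0, and the exact gradient of its output spike time with respect to w is obtained by implicit differentiation of the threshold-crossing condition, then checked against finite differences.

```python
# Illustrative sketch only (not the paper's code): exact differentiation
# through one discrete spike event for a toy LIF neuron. All parameters
# below are hypothetical choices.
import numpy as np

TAU_M, TAU_S, THETA = 10.0, 5.0, 1.0  # membrane/synaptic time constants, threshold

def voltage(t, w):
    # Closed-form solution of tau_m dV/dt = -V + I, tau_s dI/dt = -I,
    # with V(0) = 0 and I(0) = w (input spike of weight w at t = 0).
    a = w * TAU_S / (TAU_M - TAU_S)
    return a * (np.exp(-t / TAU_M) - np.exp(-t / TAU_S))

def dvoltage_dt(t, w):
    # Time derivative of the membrane potential.
    a = w * TAU_S / (TAU_M - TAU_S)
    return a * (np.exp(-t / TAU_S) / TAU_S - np.exp(-t / TAU_M) / TAU_M)

def spike_time(w):
    # First threshold crossing, found by bisection on the rising flank,
    # where V increases monotonically from 0 to its peak at t_peak.
    t_peak = TAU_M * TAU_S / (TAU_M - TAU_S) * np.log(TAU_M / TAU_S)
    assert voltage(t_peak, w) > THETA, "weight too small: no spike is emitted"
    lo, hi = 0.0, t_peak
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if voltage(mid, w) < THETA else (lo, mid)
    return 0.5 * (lo + hi)

w = 5.0
t_star = spike_time(w)
# Implicit differentiation of the threshold condition V(t*, w) = THETA gives
# dt*/dw = -(dV/dw) / (dV/dt) at t*; V is linear in w, so dV/dw = THETA / w.
grad_exact = -(THETA / w) / dvoltage_dt(t_star, w)
# Finite-difference check of the exact event-based gradient.
eps = 1e-6
grad_fd = (spike_time(w + eps) - spike_time(w - eps)) / (2.0 * eps)
print(f"t* = {t_star:.6f}, exact dt*/dw = {grad_exact:.6f}, FD = {grad_fd:.6f}")
```

In EventProp proper, per the abstract, this same threshold-crossing derivative enters as a jump in the adjoint variables at each recorded spike time, which is what makes the backward pass event-based and temporally sparse.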
format Online
Article
Text
id pubmed-8213775
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-8213775 2021-06-22 Event-based backpropagation can compute exact gradients for spiking neural networks Wunderlich, Timo C. Pehle, Christian Sci Rep Article Spiking neural networks combine analog computation with event-based communication using discrete spikes. While the impressive advances of deep learning are enabled by training non-spiking artificial neural networks using the backpropagation algorithm, applying this algorithm to spiking networks was previously hindered by the existence of discrete spike events and discontinuities. For the first time, this work derives the backpropagation algorithm for a continuous-time spiking neural network and a general loss function by applying the adjoint method together with the proper partial derivative jumps, allowing for backpropagation through discrete spike events without approximations. This algorithm, EventProp, backpropagates errors at spike times in order to compute the exact gradient in an event-based, temporally and spatially sparse fashion. We use gradients computed via EventProp to train networks on the Yin-Yang and MNIST datasets using either a spike-time or voltage-based loss function and report competitive performance. Our work supports the rigorous study of gradient-based learning algorithms in spiking neural networks and provides insights toward their implementation in novel brain-inspired hardware. Nature Publishing Group UK 2021-06-18 /pmc/articles/PMC8213775/ /pubmed/34145314 http://dx.doi.org/10.1038/s41598-021-91786-z Text en © The Author(s) 2021 https://creativecommons.org/licenses/by/4.0/ Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.
spellingShingle Article
Wunderlich, Timo C.
Pehle, Christian
Event-based backpropagation can compute exact gradients for spiking neural networks
title Event-based backpropagation can compute exact gradients for spiking neural networks
title_full Event-based backpropagation can compute exact gradients for spiking neural networks
title_fullStr Event-based backpropagation can compute exact gradients for spiking neural networks
title_full_unstemmed Event-based backpropagation can compute exact gradients for spiking neural networks
title_short Event-based backpropagation can compute exact gradients for spiking neural networks
title_sort event-based backpropagation can compute exact gradients for spiking neural networks
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8213775/
https://www.ncbi.nlm.nih.gov/pubmed/34145314
http://dx.doi.org/10.1038/s41598-021-91786-z
work_keys_str_mv AT wunderlichtimoc eventbasedbackpropagationcancomputeexactgradientsforspikingneuralnetworks
AT pehlechristian eventbasedbackpropagationcancomputeexactgradientsforspikingneuralnetworks