Backpropagation With Sparsity Regularization for Spiking Neural Network Learning

The spiking neural network (SNN) is a promising pathway toward low-power, energy-efficient processing and computing that exploits the spike-driven and sparsity features of biological systems. This article proposes a sparsity-driven SNN learning algorithm, backpropagation with sparsity regularization (BPSR), which aims to improve both spiking and synaptic sparsity. Backpropagation with a spiking-regularization term minimizes the firing rate while preserving accuracy; the backpropagation captures temporal information and extends to spiking recurrent layers to support brain-like structure learning. A rewiring mechanism with synaptic regularization is proposed to further reduce the redundancy of the network structure: pruning and growth of synapses are governed by weight and gradient. Experimental results demonstrate that the network learned by BPSR is synaptically sparse and closely resembles biological systems. The method not only balances accuracy and firing rate but also facilitates SNN learning by suppressing information redundancy. We evaluate BPSR on the visual datasets MNIST, N-MNIST, and CIFAR10, and further test it on the MIT-BIH and gas-sensor datasets. The results show that the algorithm achieves accuracy comparable or superior to related works, with sparse spikes and synapses.

Bibliographic Details
Main Authors: Yan, Yulong, Chu, Haoming, Jin, Yi, Huan, Yuxiang, Zou, Zhuo, Zheng, Lirong
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9047717/
https://www.ncbi.nlm.nih.gov/pubmed/35495028
http://dx.doi.org/10.3389/fnins.2022.760298
_version_ 1784695783565557760
author Yan, Yulong
Chu, Haoming
Jin, Yi
Huan, Yuxiang
Zou, Zhuo
Zheng, Lirong
author_facet Yan, Yulong
Chu, Haoming
Jin, Yi
Huan, Yuxiang
Zou, Zhuo
Zheng, Lirong
author_sort Yan, Yulong
collection PubMed
description The spiking neural network (SNN) is a promising pathway toward low-power, energy-efficient processing and computing that exploits the spike-driven and sparsity features of biological systems. This article proposes a sparsity-driven SNN learning algorithm, backpropagation with sparsity regularization (BPSR), which aims to improve both spiking and synaptic sparsity. Backpropagation with a spiking-regularization term minimizes the firing rate while preserving accuracy; the backpropagation captures temporal information and extends to spiking recurrent layers to support brain-like structure learning. A rewiring mechanism with synaptic regularization is proposed to further reduce the redundancy of the network structure: pruning and growth of synapses are governed by weight and gradient. Experimental results demonstrate that the network learned by BPSR is synaptically sparse and closely resembles biological systems. The method not only balances accuracy and firing rate but also facilitates SNN learning by suppressing information redundancy. We evaluate BPSR on the visual datasets MNIST, N-MNIST, and CIFAR10, and further test it on the MIT-BIH and gas-sensor datasets. The results show that the algorithm achieves accuracy comparable or superior to related works, with sparse spikes and synapses.
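The spiking-regularization idea in the abstract — minimizing the firing rate alongside the task loss — can be sketched as a penalized objective. This is a minimal illustration, not the paper's exact formulation: the penalty weight `lam` and the use of a plain mean firing rate are assumptions.

```python
import numpy as np

def bpsr_loss(task_loss, spikes, lam=1e-3):
    """Hypothetical sketch of a BPSR-style objective: the task loss is
    augmented with a spiking-regularization term that penalizes the mean
    firing rate, pushing the network toward sparse spiking activity.
    `spikes` is a binary spike train of shape (time_steps, neurons)."""
    firing_rate = spikes.mean()  # average spikes per neuron per time step
    return task_loss + lam * firing_rate

# Toy usage: a 10-step, 100-neuron spike train firing roughly 5% of the time.
rng = np.random.default_rng(0)
spikes = (rng.random((10, 100)) < 0.05).astype(float)
loss = bpsr_loss(0.5, spikes)
```

In a real training loop the regularizer would be differentiated through a surrogate gradient along with the task loss, so that gradient descent trades accuracy against spike count.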
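The rewiring mechanism — pruning and growth of synapses governed by weight and gradient — can likewise be sketched. The threshold, growth fraction, and re-initialization value below are assumptions for illustration; the paper's synaptic-regularization rule may differ.

```python
import numpy as np

def rewire(weights, grads, prune_thresh=0.01, grow_frac=0.05):
    """Hypothetical sketch of weight/gradient-based rewiring: synapses
    with small weight magnitude are pruned (zeroed), and up to a fixed
    fraction of inactive synapses with the largest gradient magnitude
    are grown back at a small initial weight."""
    w = weights.copy()
    # Prune: zero out weak synapses.
    w[np.abs(w) < prune_thresh] = 0.0
    # Grow: among inactive synapses, enable those with the largest |grad|.
    inactive = np.where(w.ravel() == 0.0)[0]
    n_grow = int(grow_frac * w.size)
    if inactive.size and n_grow:
        order = np.argsort(-np.abs(grads.ravel()[inactive]))[:n_grow]
        w.ravel()[inactive[order]] = prune_thresh  # small initial weight
    return w

# Toy usage: one weak synapse (0.001) is pruned; the two inactive slots
# with the largest |grad| are grown back at the threshold value.
w = np.array([[0.5, 0.001], [0.0, 0.2]])
g = np.array([[0.1, 0.9], [0.8, 0.1]])
new_w = rewire(w, g, prune_thresh=0.01, grow_frac=0.5)
```

Alternating such a rewiring step with gradient updates keeps the synapse count roughly constant while letting the connectivity pattern adapt, which matches the abstract's claim of sparse, redundancy-suppressed network structure.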
format Online
Article
Text
id pubmed-9047717
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-9047717 2022-04-29 Backpropagation With Sparsity Regularization for Spiking Neural Network Learning Yan, Yulong Chu, Haoming Jin, Yi Huan, Yuxiang Zou, Zhuo Zheng, Lirong Front Neurosci Neuroscience The spiking neural network (SNN) is a promising pathway toward low-power, energy-efficient processing and computing that exploits the spike-driven and sparsity features of biological systems. This article proposes a sparsity-driven SNN learning algorithm, backpropagation with sparsity regularization (BPSR), which aims to improve both spiking and synaptic sparsity. Backpropagation with a spiking-regularization term minimizes the firing rate while preserving accuracy; the backpropagation captures temporal information and extends to spiking recurrent layers to support brain-like structure learning. A rewiring mechanism with synaptic regularization is proposed to further reduce the redundancy of the network structure: pruning and growth of synapses are governed by weight and gradient. Experimental results demonstrate that the network learned by BPSR is synaptically sparse and closely resembles biological systems. The method not only balances accuracy and firing rate but also facilitates SNN learning by suppressing information redundancy. We evaluate BPSR on the visual datasets MNIST, N-MNIST, and CIFAR10, and further test it on the MIT-BIH and gas-sensor datasets. The results show that the algorithm achieves accuracy comparable or superior to related works, with sparse spikes and synapses. Frontiers Media S.A. 2022-04-14 /pmc/articles/PMC9047717/ /pubmed/35495028 http://dx.doi.org/10.3389/fnins.2022.760298 Text en Copyright © 2022 Yan, Chu, Jin, Huan, Zou and Zheng. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY).
The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Yan, Yulong
Chu, Haoming
Jin, Yi
Huan, Yuxiang
Zou, Zhuo
Zheng, Lirong
Backpropagation With Sparsity Regularization for Spiking Neural Network Learning
title Backpropagation With Sparsity Regularization for Spiking Neural Network Learning
title_full Backpropagation With Sparsity Regularization for Spiking Neural Network Learning
title_fullStr Backpropagation With Sparsity Regularization for Spiking Neural Network Learning
title_full_unstemmed Backpropagation With Sparsity Regularization for Spiking Neural Network Learning
title_short Backpropagation With Sparsity Regularization for Spiking Neural Network Learning
title_sort backpropagation with sparsity regularization for spiking neural network learning
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9047717/
https://www.ncbi.nlm.nih.gov/pubmed/35495028
http://dx.doi.org/10.3389/fnins.2022.760298
work_keys_str_mv AT yanyulong backpropagationwithsparsityregularizationforspikingneuralnetworklearning
AT chuhaoming backpropagationwithsparsityregularizationforspikingneuralnetworklearning
AT jinyi backpropagationwithsparsityregularizationforspikingneuralnetworklearning
AT huanyuxiang backpropagationwithsparsityregularizationforspikingneuralnetworklearning
AT zouzhuo backpropagationwithsparsityregularizationforspikingneuralnetworklearning
AT zhenglirong backpropagationwithsparsityregularizationforspikingneuralnetworklearning