ALBSNN: ultra-low latency adaptive local binary spiking neural network with accuracy loss estimator

Spiking neural networks (SNNs) are brain-inspired models with strong spatio-temporal information processing capacity and high computational energy efficiency. However, as SNNs grow deeper, the memory consumed by their weights has attracted increasing attention. In this study, we propose an ultra-low latency adaptive local binary spiking neural network (ALBSNN) with an accuracy loss estimator, which dynamically selects the network layers to be binarized, balancing the degree of quantization against classification accuracy by evaluating the error introduced by the binarized weights during training. To accelerate training, a global average pooling (GAP) layer combining convolution and pooling replaces the fully connected layers. Finally, to further reduce the error caused by the binary weights, we propose binary weight optimization (BWO), which updates the overall weights by directly adjusting the binary weights; this further reduces the loss of a network that has reached its training bottleneck. Together, these methods balance quantization against recognition ability, allowing the network to maintain recognition accuracy equivalent to a full-precision network while reducing storage by more than 20%, so that SNNs can obtain good recognition accuracy with a small number of time steps. In the extreme case of a single time step, we still achieve 93.39%, 92.12%, and 69.55% test accuracy on three traditional static datasets, Fashion-MNIST, CIFAR-10, and CIFAR-100, respectively. We also evaluate our method on the neuromorphic N-MNIST, CIFAR10-DVS, and IBM DVS128 Gesture datasets and achieve advanced accuracy among SNNs with binary weights. Our network therefore offers clear advantages in storage resources and training time.
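
A note on the core mechanism: the local binarization the abstract describes builds on the standard binary-weight construction, in which each selected layer keeps full-precision "shadow" weights for the optimizer but uses sign-binarized weights in the forward pass, with a straight-through estimator (STE) carrying gradients back. The PyTorch sketch below is a generic illustration under that assumption, not the authors' released code; the class names and the mean-absolute-value scaling factor are our own choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    """Sign-binarize weights in the forward pass; pass gradients
    straight through, clipped to |w| <= 1, in the backward pass."""
    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.sign(w)

    @staticmethod
    def backward(ctx, grad_out):
        (w,) = ctx.saved_tensors
        # Straight-through estimator: zero the gradient where |w| > 1.
        return grad_out * (w.abs() <= 1).to(grad_out.dtype)

class BinaryConv2d(nn.Conv2d):
    """Convolution whose weights are binarized to +/-1 (times a
    per-layer scale) at forward time, while the optimizer still
    updates the underlying full-precision weights."""
    def forward(self, x):
        scale = self.weight.abs().mean()   # layer-wise scaling (an assumption)
        w_bin = BinarizeSTE.apply(self.weight) * scale
        return F.conv2d(x, w_bin, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)
```

An accuracy loss estimator in the sense of the abstract would sit on top of such layers, measuring the error the binarized weights introduce during training and deciding per layer whether to binarize; the concrete criterion is defined in the paper and not reproduced here.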

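The GAP replacement mentioned in the abstract is likewise a standard construction: a 1x1 convolution maps feature channels to class channels, and global average pooling collapses each class map to a single logit, removing the fully connected layers entirely. A minimal PyTorch sketch follows; the channel count (128) and class count (10) are placeholders, not the paper's configuration.

```python
import torch
import torch.nn as nn

# Classifier head with no fully connected layers: convolution plus
# pooling replace the dense classifier, shrinking the parameter count.
head = nn.Sequential(
    nn.Conv2d(128, 10, kernel_size=1),  # 128 feature maps -> 10 class maps
    nn.AdaptiveAvgPool2d(1),            # global average pooling to 1x1
    nn.Flatten(),                       # (N, 10, 1, 1) -> (N, 10) logits
)

x = torch.randn(4, 128, 8, 8)           # dummy batch of feature maps
print(head(x).shape)                    # torch.Size([4, 10])
```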

Bibliographic Details
Main Authors: Pei, Yijian; Xu, Changqing; Wu, Zili; Liu, Yi; Yang, Yintang
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2023-09-13
Journal: Front Neurosci
Subjects: Neuroscience
Collection: PubMed, record pubmed-10525310 (National Center for Biotechnology Information)
Record Format: MEDLINE/PubMed
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10525310/
https://www.ncbi.nlm.nih.gov/pubmed/37771337
http://dx.doi.org/10.3389/fnins.2023.1225871

Copyright © 2023 Pei, Xu, Wu, Liu and Yang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY, https://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.