High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron

Spiking neural networks (SNNs) have attracted intensive attention due to their efficient event-driven computing paradigm. Among SNN training methods, ANN-to-SNN conversion is usually regarded as achieving state-of-the-art recognition accuracy. However, many existing ANN-to-SNN techniques impose lengthy post-conversion steps, such as threshold balancing and weight renormalization, to compensate for the inherent behavioral discrepancy between artificial and spiking neurons. In addition, they require a long temporal window to encode and process as many spikes as possible to better approximate the real-valued ANN neurons, leading to high inference latency. To overcome these challenges, we propose a calcium-gated bipolar leaky integrate-and-fire (Ca-LIF) spiking neuron model to better approximate the function of the ReLU neurons widely adopted in ANNs. We also propose a quantization-aware training (QAT)-based framework that leverages an off-the-shelf QAT toolkit for easy ANN-to-SNN conversion and directly exports the learned ANN weights to SNNs, requiring no post-conversion processing. We benchmarked our method on typical deep network structures with time-step lengths varying from 8 to 128. Compared to other work, our converted SNNs achieved competitively high accuracy while requiring relatively few inference time steps.

Bibliographic Details
Main Authors: Gao, Haoran, He, Junxian, Wang, Haibing, Wang, Tengxiao, Zhong, Zhengqing, Yu, Jianyi, Wang, Ying, Tian, Min, Shi, Cong
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10030499/
https://www.ncbi.nlm.nih.gov/pubmed/36968504
http://dx.doi.org/10.3389/fnins.2023.1141701
author Gao, Haoran
He, Junxian
Wang, Haibing
Wang, Tengxiao
Zhong, Zhengqing
Yu, Jianyi
Wang, Ying
Tian, Min
Shi, Cong
collection PubMed
description Spiking neural networks (SNNs) have attracted intensive attention due to their efficient event-driven computing paradigm. Among SNN training methods, ANN-to-SNN conversion is usually regarded as achieving state-of-the-art recognition accuracy. However, many existing ANN-to-SNN techniques impose lengthy post-conversion steps, such as threshold balancing and weight renormalization, to compensate for the inherent behavioral discrepancy between artificial and spiking neurons. In addition, they require a long temporal window to encode and process as many spikes as possible to better approximate the real-valued ANN neurons, leading to high inference latency. To overcome these challenges, we propose a calcium-gated bipolar leaky integrate-and-fire (Ca-LIF) spiking neuron model to better approximate the function of the ReLU neurons widely adopted in ANNs. We also propose a quantization-aware training (QAT)-based framework that leverages an off-the-shelf QAT toolkit for easy ANN-to-SNN conversion and directly exports the learned ANN weights to SNNs, requiring no post-conversion processing. We benchmarked our method on typical deep network structures with time-step lengths varying from 8 to 128. Compared to other work, our converted SNNs achieved competitively high accuracy while requiring relatively few inference time steps.
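The description names the two ingredients of the method, a quantization-aware-trained ANN and a spiking neuron that mimics its quantized ReLU, without giving equations. The following is a minimal NumPy sketch of the general principle behind QAT-based ANN-to-SNN conversion, not the authors' implementation: the Ca-LIF neuron's calcium gating and bipolar behavior are not described in this record and are not modeled here, and the names quantized_relu and if_neuron_rate are hypothetical.

```python
# Minimal sketch of the generic QAT -> SNN correspondence (assumption:
# this illustrates the standard quantized-ReLU / integrate-and-fire
# equivalence, NOT the paper's Ca-LIF model or its QAT toolkit).
import numpy as np

def quantized_relu(x, v_th=1.0, timesteps=8):
    """QAT-style activation: ReLU clipped at v_th and rounded (half-up)
    to `timesteps` discrete levels -- the behavior quantization-aware
    training bakes into the ANN before conversion."""
    levels = np.floor(np.clip(x, 0.0, v_th) * timesteps / v_th + 0.5)
    return levels * v_th / timesteps

def if_neuron_rate(x, v_th=1.0, timesteps=8):
    """Integrate-and-fire neuron driven by a constant input x for
    `timesteps` steps with soft reset (subtract threshold on firing).
    Starting at v_th/2 makes the spike count equal round(x*T/v_th) for
    inputs in [0, v_th], so the rate code matches quantized_relu."""
    v, spikes = v_th / 2.0, 0
    for _ in range(timesteps):
        v += x                # integrate the (constant) input current
        if v >= v_th:         # threshold crossing -> emit one spike
            spikes += 1
            v -= v_th         # soft reset keeps the sub-threshold residue
    return spikes * v_th / timesteps

if __name__ == "__main__":
    # The two paths agree level-for-level, which is why a QAT-trained ANN
    # can export its weights to an SNN with no post-conversion rebalancing.
    for x in (0.0, 0.3, 0.62, 0.99, 1.5):
        print(f"x={x:4}: ANN={quantized_relu(x):.4f}  SNN={if_neuron_rate(x):.4f}")
```

The half-threshold initial membrane potential is one known way to align the spike count with round-to-nearest quantization; whether the paper's framework uses this particular alignment is not stated in this record.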
format Online
Article
Text
id pubmed-10030499
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-10030499 2023-03-23 High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron Gao, Haoran; He, Junxian; Wang, Haibing; Wang, Tengxiao; Zhong, Zhengqing; Yu, Jianyi; Wang, Ying; Tian, Min; Shi, Cong. Front Neurosci (Neuroscience). Frontiers Media S.A. 2023-03-08 /pmc/articles/PMC10030499/ /pubmed/36968504 http://dx.doi.org/10.3389/fnins.2023.1141701 Text en Copyright © 2023 Gao, He, Wang, Wang, Zhong, Yu, Wang, Tian and Shi. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10030499/
https://www.ncbi.nlm.nih.gov/pubmed/36968504
http://dx.doi.org/10.3389/fnins.2023.1141701