
IDSNN: Towards High-Performance and Low-Latency SNN Training via Initialization and Distillation


Bibliographic Details
Main Authors: Fan, Xiongfei; Zhang, Hong; Zhang, Yu
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10452895/
https://www.ncbi.nlm.nih.gov/pubmed/37622980
http://dx.doi.org/10.3390/biomimetics8040375
collection PubMed
description Spiking neural networks (SNNs) are widely recognized for their biomimetic and efficient computing features. They utilize spikes to encode and transmit information. Despite their many advantages, SNNs suffer from low accuracy and large inference latency, caused, respectively, by direct training and by conversion from artificial neural network (ANN) training methods. To address these limitations, we propose a novel training pipeline (called IDSNN) based on parameter initialization and knowledge distillation, using an ANN as both a parameter source and a teacher. IDSNN maximizes the knowledge extracted from ANNs and achieves competitive top-1 accuracy on CIFAR10 ([Formula: see text]) and CIFAR100 ([Formula: see text]) with low latency. More importantly, it converges [Formula: see text] faster than directly trained SNNs under limited training resources, which demonstrates its practical value in applications.
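The knowledge-distillation component the abstract describes can be illustrated with a minimal sketch of the standard temperature-softened KL distillation objective, in which an ANN teacher's softened output distribution supervises the student. All names, the temperature value, and the loss form are illustrative assumptions from common distillation practice, not taken from the paper, whose exact loss may differ.

```python
# Minimal sketch of a temperature-softened distillation loss, assuming
# the standard KL(teacher || student) formulation scaled by T^2.
# Function names and the temperature are hypothetical, not from IDSNN.
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                       # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    """KL divergence between softened teacher and student distributions,
    scaled by T^2 so gradients keep a comparable magnitude across T."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (temperature ** 2) * kl

# The closer the student's logits track the teacher's, the smaller the loss.
teacher = [2.0, 0.5, -1.0]
close_student = [1.9, 0.6, -0.9]
far_student = [-1.0, 0.5, 2.0]
assert kd_loss(teacher, close_student) < kd_loss(teacher, far_student)
```

In practice this distillation term would be combined with the task loss (e.g. cross-entropy on hard labels), and for an SNN student the logits would come from accumulated output spikes over the simulation time steps.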
format Online Article Text
id pubmed-10452895
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-10452895 2023-08-26 IDSNN: Towards High-Performance and Low-Latency SNN Training via Initialization and Distillation. Fan, Xiongfei; Zhang, Hong; Zhang, Yu. Biomimetics (Basel), Article. MDPI 2023-08-18 /pmc/articles/PMC10452895/ /pubmed/37622980 http://dx.doi.org/10.3390/biomimetics8040375 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title IDSNN: Towards High-Performance and Low-Latency SNN Training via Initialization and Distillation
topic Article