Efficient training of spiking neural networks with temporally-truncated local backpropagation through time

Directly training spiking neural networks (SNNs) has remained challenging due to complex neural dynamics and the intrinsic non-differentiability of firing functions. The well-known backpropagation through time (BPTT) algorithm proposed to train SNNs suffers from a large memory footprint and prohibits backward and update unlocking, making it impossible to exploit the potential of locally-supervised training methods. This work proposes an efficient and direct training algorithm for SNNs that integrates a locally-supervised training method with a temporally-truncated BPTT algorithm. The proposed algorithm exploits both temporal and spatial locality in BPTT and contributes to a significant reduction in computational cost, including GPU memory utilization, main memory access, and arithmetic operations. We thoroughly explore the design space concerning temporal truncation length and local training block size and benchmark their impact on the classification accuracy of different networks running different types of tasks. The results reveal that temporal truncation has a negative effect on accuracy when classifying frame-based datasets, but improves accuracy on event-based datasets. In spite of the resulting information loss, local training is capable of alleviating overfitting. The combined effect of temporal truncation and local training can slow the drop in accuracy and even improve it. In addition, training deep SNN models such as AlexNet on the CIFAR10-DVS dataset leads to a 7.26% increase in accuracy, an 89.94% reduction in GPU memory, a 10.79% reduction in memory access, and a 99.64% reduction in MAC operations compared to standard end-to-end BPTT. Thus, the proposed method shows high potential to enable fast and energy-efficient on-chip training for real-time learning at the edge.
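The abstract bundles two technical ideas: a surrogate gradient to get past the non-differentiable firing function, and a BPTT variant that truncates the backward pass in time (every K timesteps) and in space (one local loss per block). As an editorial illustration only, the sketch below shows one way such pieces commonly fit together in PyTorch; the neuron model (LIF with hard reset), the rectangular surrogate, and all names (SpikeFn, LIFBlock, train_step, K) are assumptions for illustration, not the authors' released implementation.

# Sketch of temporally-truncated, locally-supervised BPTT for an SNN.
# Assumptions: LIF neurons, rectangular surrogate gradient, truncation
# length K, one local classifier per block. Hypothetical names throughout.
import torch
import torch.nn as nn

class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate in the backward pass."""
    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradients only near the threshold (window half-width 0.5, assumed).
        return grad_out * (v.abs() < 0.5).float()

class LIFBlock(nn.Module):
    """One locally-trained block: linear synapse + LIF state + local readout."""
    def __init__(self, d_in, d_out, n_classes, tau=2.0):
        super().__init__()
        self.syn = nn.Linear(d_in, d_out)
        self.readout = nn.Linear(d_out, n_classes)  # local classifier head
        self.decay = 1.0 - 1.0 / tau

    def step(self, x_t, v):
        v = self.decay * v + self.syn(x_t)          # leaky integration
        s = SpikeFn.apply(v - 1.0)                  # fire at threshold 1.0
        v = v * (1.0 - s)                           # hard reset after a spike
        return s, v

def train_step(blocks, opts, x_seq, target, K=4):
    """x_seq: (T, batch, d_in). Truncate BPTT every K steps; each block has
    its own loss and never backpropagates into earlier blocks."""
    T, B = x_seq.shape[0], x_seq.shape[1]
    states = [None] * len(blocks)
    loss_fn = nn.CrossEntropyLoss()
    for t0 in range(0, T, K):                       # one truncated segment
        inp = x_seq[t0:t0 + K]
        for i, (blk, opt) in enumerate(zip(blocks, opts)):
            v = x_seq.new_zeros((B, blk.syn.out_features)) if states[i] is None \
                else states[i].detach()             # temporal truncation
            rate, spikes = 0.0, []
            for x_t in inp:
                s, v = blk.step(x_t, v)
                spikes.append(s)
                rate = rate + blk.readout(s)        # accumulate local logits
            loss = loss_fn(rate / inp.shape[0], target)
            opt.zero_grad()
            loss.backward()                         # gradients stay inside this block
            opt.step()
            states[i] = v
            inp = torch.stack(spikes).detach()      # spatial locality: cut the graph

A two-block toy run would be blocks = [LIFBlock(784, 256, 10), LIFBlock(256, 128, 10)] with one torch.optim.Adam optimizer per block. Because every detach cuts the autograd graph, activations need to be stored for at most K timesteps of a single block at a time, which is presumably the mechanism behind the memory savings the abstract reports.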


Bibliographic Details
Main Authors: Guo, Wenzhe, Fouda, Mohammed E., Eltawil, Ahmed M., Salama, Khaled Nabil
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2023
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10117667/
https://www.ncbi.nlm.nih.gov/pubmed/37090791
http://dx.doi.org/10.3389/fnins.2023.1047008
Record ID: pubmed-10117667
Collection: PubMed
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Front Neurosci (Neuroscience)
Published Online: 2023-04-06
Copyright © 2023 Guo, Fouda, Eltawil and Salama. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY): https://creativecommons.org/licenses/by/4.0/. The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.