
Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization

Spiking Neural Networks (SNNs) may offer an energy-efficient alternative for implementing deep learning applications. In recent years, there have been several proposals focused on supervised (conversion, spike-based gradient descent) and unsupervised (spike timing dependent plasticity) training methods to improve the accuracy of SNNs on large-scale tasks. However, each of these methods suffers from scalability, latency, and accuracy limitations.


Bibliographic Details
Main Authors: Panda, Priyadarshini, Aketi, Sai Aparna, Roy, Kaushik
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7339963/
https://www.ncbi.nlm.nih.gov/pubmed/32694977
http://dx.doi.org/10.3389/fnins.2020.00653
_version_ 1783554966825205760
author Panda, Priyadarshini
Aketi, Sai Aparna
Roy, Kaushik
author_facet Panda, Priyadarshini
Aketi, Sai Aparna
Roy, Kaushik
author_sort Panda, Priyadarshini
collection PubMed
description Spiking Neural Networks (SNNs) may offer an energy-efficient alternative for implementing deep learning applications. In recent years, there have been several proposals focused on supervised (conversion, spike-based gradient descent) and unsupervised (spike timing dependent plasticity) training methods to improve the accuracy of SNNs on large-scale tasks. However, each of these methods suffers from scalability, latency, and accuracy limitations. In this paper, we propose novel algorithmic techniques that modify the SNN configuration with backward residual connections, stochastic softmax, and hybrid artificial-and-spiking neuronal activations to improve the learning ability of the training methodologies, yielding competitive accuracy while achieving large efficiency gains over their artificial counterparts. Note that "artificial counterparts" here refers to conventional deep learning/artificial neural networks. Our techniques apply to VGG/Residual architectures and are compatible with all forms of training methodologies. Our analysis reveals that the proposed solutions yield near state-of-the-art accuracy with significant energy efficiency and reduced parameter overhead, translating to hardware improvements on complex visual recognition tasks such as the CIFAR10 and ImageNet datasets.
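The "stochastic softmax" named in the abstract is one of the paper's proposed modifications; its precise formulation is given in the full text, not in this record. As a rough, hypothetical illustration of the general idea only (sampling the output class from the softmax distribution rather than always taking the argmax), a minimal NumPy sketch might look like this:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax: shift by the max before exponentiating.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

def stochastic_softmax_sample(logits, rng):
    # Draw the predicted class from the softmax distribution instead of
    # deterministically taking the argmax, making the output layer stochastic.
    p = softmax(logits)
    return int(rng.choice(len(p), p=p))

rng = np.random.default_rng(0)
logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)          # valid distribution: entries sum to 1
sample = stochastic_softmax_sample(logits, rng)  # an index in {0, 1, 2}
```

This sketch is an assumption about the general mechanism, not the authors' definition; `stochastic_softmax_sample` and its sampling rule are illustrative names introduced here.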
format Online
Article
Text
id pubmed-7339963
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-73399632020-07-20 Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization Panda, Priyadarshini Aketi, Sai Aparna Roy, Kaushik Front Neurosci Neuroscience Spiking Neural Networks (SNNs) may offer an energy-efficient alternative for implementing deep learning applications. In recent years, there have been several proposals focused on supervised (conversion, spike-based gradient descent) and unsupervised (spike timing dependent plasticity) training methods to improve the accuracy of SNNs on large-scale tasks. However, each of these methods suffers from scalability, latency, and accuracy limitations. In this paper, we propose novel algorithmic techniques that modify the SNN configuration with backward residual connections, stochastic softmax, and hybrid artificial-and-spiking neuronal activations to improve the learning ability of the training methodologies, yielding competitive accuracy while achieving large efficiency gains over their artificial counterparts. Note that "artificial counterparts" here refers to conventional deep learning/artificial neural networks. Our techniques apply to VGG/Residual architectures and are compatible with all forms of training methodologies. Our analysis reveals that the proposed solutions yield near state-of-the-art accuracy with significant energy efficiency and reduced parameter overhead, translating to hardware improvements on complex visual recognition tasks such as the CIFAR10 and ImageNet datasets. Frontiers Media S.A. 2020-06-30 /pmc/articles/PMC7339963/ /pubmed/32694977 http://dx.doi.org/10.3389/fnins.2020.00653 Text en Copyright © 2020 Panda, Aketi and Roy. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). 
The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Panda, Priyadarshini
Aketi, Sai Aparna
Roy, Kaushik
Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization
title Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization
title_full Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization
title_fullStr Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization
title_full_unstemmed Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization
title_short Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization
title_sort toward scalable, efficient, and accurate deep spiking neural networks with backward residual connections, stochastic softmax, and hybridization
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7339963/
https://www.ncbi.nlm.nih.gov/pubmed/32694977
http://dx.doi.org/10.3389/fnins.2020.00653
work_keys_str_mv AT pandapriyadarshini towardscalableefficientandaccuratedeepspikingneuralnetworkswithbackwardresidualconnectionsstochasticsoftmaxandhybridization
AT aketisaiaparna towardscalableefficientandaccuratedeepspikingneuralnetworkswithbackwardresidualconnectionsstochasticsoftmaxandhybridization
AT roykaushik towardscalableefficientandaccuratedeepspikingneuralnetworkswithbackwardresidualconnectionsstochasticsoftmaxandhybridization