BlocTrain: Block-Wise Conditional Training and Inference for Efficient Spike-Based Deep Learning
Spiking neural networks (SNNs), with their inherent capability to learn sparse spike-based input representations over time, offer a promising solution for enabling the next generation of intelligent autonomous systems. Nevertheless, end-to-end training of deep SNNs is both compute- and memory-intens...
Main Authors: Srinivasan, Gopalakrishnan; Roy, Kaushik
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8586528/
https://www.ncbi.nlm.nih.gov/pubmed/34776834
http://dx.doi.org/10.3389/fnins.2021.603433
Similar Items
- Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning
  by: Lee, Chankyu, et al.
  Published: (2018)
- Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures
  by: Lee, Chankyu, et al.
  Published: (2020)
- ReStoCNet: Residual Stochastic Binary Convolutional Spiking Neural Network for Memory-Efficient Neuromorphic Computing
  by: Srinivasan, Gopalakrishnan, et al.
  Published: (2019)
- SpiLinC: Spiking Liquid-Ensemble Computing for Unsupervised Speech and Image Recognition
  by: Srinivasan, Gopalakrishnan, et al.
  Published: (2018)
- Training Deep Spiking Neural Networks Using Backpropagation
  by: Lee, Jun Haeng, et al.
  Published: (2016)