On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices
Hardware-based spiking neural networks (SNNs) inspired by a biological nervous system are regarded as innovative computing systems with very low power consumption and massively parallel operation. To train SNNs with supervision, we propose an efficient on-chip training scheme approximating the backpropagation algorithm in a form suitable for hardware implementation. We show that the accuracy of the proposed scheme for SNNs is close to that of conventional artificial neural networks (ANNs) by exploiting the stochastic characteristics of neurons. In the hardware configuration, gated Schottky diodes (GSDs), whose current saturates with respect to the input voltage, are used as synaptic devices. We design the SNN system using the proposed on-chip training scheme with the GSDs, which can update their conductance in parallel to speed up the overall system. The performance of the on-chip training SNN system is validated through MNIST data set classification as a function of network size and total number of time steps. The SNN systems achieve accuracies of 97.83% with 1 hidden layer and 98.44% with 4 hidden layers in fully connected neural networks. We then evaluate the effect of the non-linearity and asymmetry of the conductance response for long-term potentiation (LTP) and long-term depression (LTD) on the performance of the on-chip training SNN system. In addition, the impact of device variations on the performance of the on-chip training SNN system is evaluated.
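As a rough illustration of the rate-coded, stochastic-neuron idea the abstract describes, here is a minimal NumPy sketch of an approximated backpropagation step. The network sizes, time-step count `T`, clipped-ReLU firing probability, and step-function surrogate derivative are illustrative assumptions, not the authors' exact neuron model or update circuit.

```python
import numpy as np

rng = np.random.default_rng(0)

def spike_rate(v, T=64):
    """Rate-coded stochastic neuron (assumed model): on each of T time
    steps the neuron fires with probability clip(v, 0, 1). The mean
    spike count over T steps approximates a clipped ReLU, which is what
    lets a stochastic SNN forward pass stand in for an ANN forward pass."""
    p = np.clip(v, 0.0, 1.0)
    spikes = rng.random((T,) + np.shape(v)) < p
    return spikes.mean(axis=0)

# Toy 784-100-10 fully connected network (MNIST-like sizes, assumed).
W1 = rng.normal(0.0, 0.1, (784, 100))
W2 = rng.normal(0.0, 0.1, (100, 10))

x = rng.random(784)        # one input sample, pixel values as firing rates
t = np.eye(10)[3]          # one-hot target label

h = spike_rate(x @ W1)     # hidden-layer spike rates
y = spike_rate(h @ W2)     # output-layer spike rates

# Approximated backpropagation: measured spike rates stand in for exact
# activations, and a step function (h > 0) stands in for the activation
# derivative -- a surrogate-gradient assumption, not the paper's circuit.
lr = 0.1
err_out = t - y                       # output error
err_hid = (W2 @ err_out) * (h > 0)    # backpropagated hidden error
W2 += lr * np.outer(h, err_out)       # weight (conductance-like) updates
W1 += lr * np.outer(x, err_hid)
```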
Main Authors: | Kwon, Dongseok; Lim, Suhwan; Bae, Jong-Ho; Lee, Sung-Tae; Kim, Hyeongsu; Seo, Young-Tak; Oh, Seongbin; Kim, Jangsaeng; Yeom, Kyuho; Park, Byung-Gook; Lee, Jong-Ho |
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2020 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7358558/ https://www.ncbi.nlm.nih.gov/pubmed/32733180 http://dx.doi.org/10.3389/fnins.2020.00423 |
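The abstract above also evaluates the non-linearity and asymmetry of the LTP/LTD conductance response. A saturating-exponential pulse-response model is commonly used for such evaluations of analog synaptic devices; the sketch below is a hedged illustration of that generic model, with assumed parameters (`Gmin`, `Gmax`, `A_ltp`, `A_ltd`, `n_pulses`) rather than the paper's measured GSD characteristics.

```python
import numpy as np

# Generic analog-synapse conductance-response model (assumed, not the
# paper's fitted GSD data). Different nonlinearity factors for LTP and
# LTD produce the asymmetry the abstract refers to.
Gmin, Gmax = 0.1, 1.0      # conductance bounds (arbitrary units)
n_pulses = 64              # identical pulses per full LTP or LTD sweep
A_ltp, A_ltd = 20.0, 30.0  # nonlinearity factors; A_ltp != A_ltd -> asymmetry

def g_ltp(n):
    """Conductance after n potentiating pulses: saturating rise from Gmin."""
    B = (Gmax - Gmin) / (1 - np.exp(-n_pulses / A_ltp))
    return B * (1 - np.exp(-n / A_ltp)) + Gmin

def g_ltd(n):
    """Conductance after n depressing pulses: mirrored fall from Gmax."""
    B = (Gmax - Gmin) / (1 - np.exp(-n_pulses / A_ltd))
    return Gmax - B * (1 - np.exp(-n / A_ltd))

n = np.arange(n_pulses + 1)
print(g_ltp(n)[:5])   # nonlinear rise from Gmin toward Gmax
print(g_ltd(n)[:5])   # nonlinear fall from Gmax; asymmetric vs. LTP
# Device-to-device variation could be modeled by jittering A_ltp/A_ltd
# or Gmin/Gmax per synapse (also an assumption of this sketch).
```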
_version_ | 1783558866581061632 |
author | Kwon, Dongseok Lim, Suhwan Bae, Jong-Ho Lee, Sung-Tae Kim, Hyeongsu Seo, Young-Tak Oh, Seongbin Kim, Jangsaeng Yeom, Kyuho Park, Byung-Gook Lee, Jong-Ho |
author_sort | Kwon, Dongseok |
collection | PubMed |
description | Hardware-based spiking neural networks (SNNs) inspired by a biological nervous system are regarded as an innovative computing system with very low power consumption and massively parallel operation. To train SNNs with supervision, we propose an efficient on-chip training scheme approximating backpropagation algorithm suitable for hardware implementation. We show that the accuracy of the proposed scheme for SNNs is close to that of conventional artificial neural networks (ANNs) by using the stochastic characteristics of neurons. In a hardware configuration, gated Schottky diodes (GSDs) are used as synaptic devices, which have a saturated current with respect to the input voltage. We design the SNN system by using the proposed on-chip training scheme with the GSDs, which can update their conductance in parallel to speed up the overall system. The performance of the on-chip training SNN system is validated through MNIST data set classification based on network size and total time step. The SNN systems achieve accuracy of 97.83% with 1 hidden layer and 98.44% with 4 hidden layers in fully connected neural networks. We then evaluate the effect of non-linearity and asymmetry of conductance response for long-term potentiation (LTP) and long-term depression (LTD) on the performance of the on-chip training SNN system. In addition, the impact of device variations on the performance of the on-chip training SNN system is evaluated. |
format | Online Article Text |
id | pubmed-7358558 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-7358558 2020-07-29 Front Neurosci Neuroscience Frontiers Media S.A. 2020-07-07 /pmc/articles/PMC7358558/ /pubmed/32733180 http://dx.doi.org/10.3389/fnins.2020.00423 Text en Copyright © 2020 Kwon, Lim, Bae, Lee, Kim, Seo, Oh, Kim, Yeom, Park and Lee. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
title | On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices |
title_short | On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices |
title_sort | on-chip training spiking neural networks using approximated backpropagation with analog synaptic devices |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7358558/ https://www.ncbi.nlm.nih.gov/pubmed/32733180 http://dx.doi.org/10.3389/fnins.2020.00423 |