
Fast Convergence of Competitive Spiking Neural Networks with Sample-Based Weight Initialization

Recent work on spiking neural networks showed good progress towards unsupervised feature learning. In particular, networks called Competitive Spiking Neural Networks (CSNN) achieve reasonable accuracy in classification tasks. However, two major disadvantages limit their practical applications: high computational complexity and slow convergence. While the first problem has partially been addressed with the development of neuromorphic hardware, no work has addressed the latter problem. In this paper we show that the number of samples the CSNN needs to converge can be reduced significantly by a proposed new weight initialization. The proposed method uses input samples as initial values for the connection weights. Surprisingly, this simple initialization reduces the number of training samples needed for convergence by an order of magnitude without loss of accuracy. We use the MNIST dataset to show that the method is robust even when not all classes are seen during initialization.
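The core idea of the abstract, using input samples directly as initial connection weights of the competitive layer, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the random sample selection, and the row normalization are assumptions (the paper does not specify the exact scaling here).

```python
import numpy as np

def sample_based_init(X, n_neurons, seed=0):
    """Initialize the weights of a competitive spiking layer by copying
    randomly chosen input samples, one per output neuron (hypothetical sketch)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=n_neurons, replace=False)
    W = X[idx].astype(float).copy()  # each row holds the incoming weights of one neuron
    # Normalize each row; per-neuron weight normalization is common in
    # STDP-trained competitive layers, but is an assumption here.
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1e-12)
    return W

# Toy usage with MNIST-like flattened 28x28 images
X = np.random.rand(1000, 784)
W = sample_based_init(X, n_neurons=100)
print(W.shape)  # (100, 784)
```

Compared with random initialization, each neuron starts at a point already on the data manifold, which is consistent with the reported order-of-magnitude reduction in training samples needed for convergence.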


Bibliographic Details
Main Authors: Cachi, Paolo Gabriel, Ventura, Sebastián, Cios, Krzysztof Jozef
Format: Online Article Text
Language: English
Published: 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7274752/
http://dx.doi.org/10.1007/978-3-030-50153-2_57
Collection: PubMed
ID: pubmed-7274752
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Published in: Information Processing and Management of Uncertainty in Knowledge-Based Systems
Published online: 2020-05-16
© Springer Nature Switzerland AG 2020. This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.