
Overview of Spiking Neural Network Learning Approaches and Their Computational Complexities

Bibliographic Details
Main Authors: Pietrzak, Paweł, Szczęsny, Szymon, Huderek, Damian, Przyborowski, Łukasz
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10053242/
https://www.ncbi.nlm.nih.gov/pubmed/36991750
http://dx.doi.org/10.3390/s23063037
author Pietrzak, Paweł
Szczęsny, Szymon
Huderek, Damian
Przyborowski, Łukasz
collection PubMed
description Spiking neural networks (SNNs) are a topic of growing interest. They resemble biological neural networks in the brain more closely than their second-generation counterparts, artificial neural networks (ANNs). SNNs have the potential to be more energy efficient than ANNs on event-driven neuromorphic hardware, which could drastically reduce maintenance costs for neural network models, since their energy consumption would be much lower than that of the regular deep learning models hosted in the cloud today. However, such hardware is not yet widely available. On standard computer architectures built mainly around central processing units (CPUs) and graphics processing units (GPUs), ANNs have the upper hand in execution speed because their models of neurons, and of the connections between neurons, are simpler. They also generally lead in learning algorithms, as SNNs do not reach the same performance as their second-generation counterparts on typical machine learning benchmark tasks such as classification. In this paper, we review existing learning algorithms for spiking neural networks, divide them into categories by type, and assess their computational complexity.
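
The abstract's point that ANNs are faster on CPUs and GPUs because of their simpler neuron models can be made concrete with a small sketch. The Python snippet below is illustrative only and is not taken from the paper: it contrasts a single ReLU layer evaluation with stepping an assumed basic leaky integrate-and-fire (LIF) layer through time, where the extra temporal loop and per-step state updates are part of what the review's complexity analysis accounts for; all names and parameter values are hypothetical.

# Hypothetical illustration (not from the paper): one ANN layer evaluation
# versus stepping a simplified leaky integrate-and-fire (LIF) layer over time.
import numpy as np

def ann_layer(x, w, b):
    # Second-generation neuron: one weighted sum followed by ReLU.
    return np.maximum(0.0, w @ x + b)

def lif_layer(spikes_in, w, tau=20.0, v_th=1.0, dt=1.0):
    # Spiking neurons: membrane potentials are integrated step by step.
    # spikes_in has shape (steps, n_in); returns (steps, n_out) spike trains.
    steps, _ = spikes_in.shape
    n_out = w.shape[0]
    v = np.zeros(n_out)                      # membrane potentials
    spikes_out = np.zeros((steps, n_out))
    for t in range(steps):                   # temporal loop that ANN layers avoid
        i_syn = w @ spikes_in[t]             # synaptic input current
        v = (1.0 - dt / tau) * v + i_syn     # leaky integration (simplified LIF)
        fired = v >= v_th
        spikes_out[t] = fired
        v[fired] = 0.0                       # reset membrane after a spike
    return spikes_out

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8))
print(ann_layer(rng.random(8), w, b=np.zeros(4)))        # one matrix pass
print(lif_layer(rng.random((100, 8)) < 0.1, w).sum(0))   # spike counts over time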
format Online
Article
Text
id pubmed-10053242
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-10053242 2023-03-30 Overview of Spiking Neural Network Learning Approaches and Their Computational Complexities. Pietrzak, Paweł; Szczęsny, Szymon; Huderek, Damian; Przyborowski, Łukasz. Sensors (Basel), Review. MDPI 2023-03-11. /pmc/articles/PMC10053242/ /pubmed/36991750 http://dx.doi.org/10.3390/s23063037 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Overview of Spiking Neural Network Learning Approaches and Their Computational Complexities
topic Review
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10053242/
https://www.ncbi.nlm.nih.gov/pubmed/36991750
http://dx.doi.org/10.3390/s23063037