
Computational Efficiency of a Modular Reservoir Network for Image Recognition

The liquid state machine (LSM) is a type of recurrent spiking network with a strong relationship to neurophysiology, and it has achieved great success in time-series processing. However, the computational cost of simulation and the complex, time-dependent dynamics limit the size and functionality of LSMs....


Bibliographic Details
Main Authors: Dai, Yifan; Yamamoto, Hideaki; Sakuraba, Masao; Sato, Shigeo
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7892762/
https://www.ncbi.nlm.nih.gov/pubmed/33613220
http://dx.doi.org/10.3389/fncom.2021.594337
author Dai, Yifan
Yamamoto, Hideaki
Sakuraba, Masao
Sato, Shigeo
collection PubMed
description The liquid state machine (LSM) is a type of recurrent spiking network with a strong relationship to neurophysiology, and it has achieved great success in time-series processing. However, the computational cost of simulation and the complex, time-dependent dynamics limit the size and functionality of LSMs. This paper presents a large-scale, bio-inspired LSM with modular topology. We integrate findings on the visual cortex showing that specifically designed input synapses can fit the activation of the real cortex and perform the Hough transform, a feature extraction algorithm used in digital image processing, without additional cost. We experimentally verify that such a combination can significantly improve network functionality. The network performance is evaluated on the MNIST dataset, where the image data are encoded into spike trains by Poisson coding. We show that the proposed structure not only significantly reduces computational complexity but also achieves higher performance than previously reported networks of similar size. We also show that the proposed structure is more robust against system damage than small-world and random structures. We believe that the proposed computationally efficient method can greatly contribute to future applications of reservoir computing.
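
As an illustration of the Poisson coding mentioned in the description, the following minimal Python sketch converts a grayscale image into Poisson spike trains by treating each pixel intensity as a firing rate. The function name, time step, simulation duration, and maximum rate are illustrative assumptions and do not reflect the authors' actual implementation or parameters.

import numpy as np

def poisson_encode(image, duration_ms=100, dt_ms=1.0, max_rate_hz=200.0, seed=0):
    # Hypothetical helper: encode a grayscale image (values in [0, 1]) into
    # Poisson spike trains. Each pixel drives an independent Poisson process
    # whose firing rate is proportional to the pixel intensity.
    rng = np.random.default_rng(seed)
    rates = image.flatten() * max_rate_hz        # firing rate per pixel (Hz)
    p_spike = rates * dt_ms * 1e-3               # spike probability per time step
    num_steps = int(duration_ms / dt_ms)
    # Boolean array of shape (num_pixels, num_steps): True marks a spike.
    return rng.random((rates.size, num_steps)) < p_spike[:, None]

# Example: a random 28x28 "MNIST-like" image encoded over 100 ms of simulated time.
image = np.random.default_rng(1).random((28, 28))
spikes = poisson_encode(image)
print(spikes.shape, int(spikes.sum()))           # (784, 100) and the total spike count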
format Online
Article
Text
id pubmed-7892762
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-7892762 2021-02-20 Computational Efficiency of a Modular Reservoir Network for Image Recognition. Dai, Yifan; Yamamoto, Hideaki; Sakuraba, Masao; Sato, Shigeo. Front Comput Neurosci (Neuroscience). Frontiers Media S.A., 2021-02-05. /pmc/articles/PMC7892762/ /pubmed/33613220 http://dx.doi.org/10.3389/fncom.2021.594337 Text en Copyright © 2021 Dai, Yamamoto, Sakuraba and Sato. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY, http://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title Computational Efficiency of a Modular Reservoir Network for Image Recognition
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7892762/
https://www.ncbi.nlm.nih.gov/pubmed/33613220
http://dx.doi.org/10.3389/fncom.2021.594337