
Brain Inspired Sequences Production by Spiking Neural Networks With Reward-Modulated STDP

Understanding and producing embedded sequences governed by supra-regular grammars in language has long been considered a high-level cognitive capacity unique to humans, constituting the so-called "syntax barrier" between humans and animals. However, neuroscientists recently showed that macaques can be trained to produce embedded sequences involving supra-regular grammars through a well-designed experimental paradigm. By comparing the results of macaques and preschool children, they argued that human uniqueness may lie only in the speed and learning strategy arising from the chunking mechanism. Inspired by this research, we propose a Brain-inspired Sequence Production Spiking Neural Network (SP-SNN) to model the same production process, drawing on the memory and learning mechanisms of multi-brain-region cooperation. Experimental verification shows that SP-SNN can also handle embedded sequence production tasks, striding over the "syntax barrier." SP-SNN uses population coding and the STDP mechanism to realize working memory, and the Reward-Modulated STDP (R-STDP) mechanism to acquire supra-regular grammars; it therefore has to coordinate short-term plasticity (STP) and long-term plasticity (LTP) mechanisms simultaneously. We also found that the chunking mechanism indeed improves the model's robustness. To our knowledge, this is the first work to address the "syntax barrier" in the SNN field, providing a computational foundation for further study of the related underlying neural mechanisms in animals.
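To make the reward-modulated STDP rule mentioned in the abstract concrete, the following Python sketch shows a generic pair-based STDP window combined with an eligibility trace that is converted into weight changes only when a reward signal arrives. This is a minimal illustration under assumed parameter values and function names (stdp_kernel, update_eligibility, apply_reward are hypothetical); it does not reproduce the SP-SNN implementation described in the article.

```python
import numpy as np

# Minimal reward-modulated STDP (R-STDP) sketch with an eligibility trace.
# All constants below are assumed for illustration, not taken from the paper.
A_PLUS, A_MINUS = 0.01, 0.012   # STDP amplitudes for potentiation / depression
TAU_PLUS = TAU_MINUS = 20.0     # STDP time constants (ms)
TAU_ELIG = 1000.0               # eligibility-trace decay constant (ms)
ETA = 0.1                       # learning rate applied at reward time

def stdp_kernel(dt):
    """Classic pair-based STDP window; dt = t_post - t_pre in ms."""
    if dt >= 0:
        return A_PLUS * np.exp(-dt / TAU_PLUS)    # pre before post: potentiation
    return -A_MINUS * np.exp(dt / TAU_MINUS)      # post before pre: depression

def update_eligibility(elig, dt_ms, pre_post_pairs):
    """Decay the trace, then accumulate the STDP events of this time step."""
    elig *= np.exp(-dt_ms / TAU_ELIG)
    for i, j, dt in pre_post_pairs:               # (pre index, post index, spike-time difference)
        elig[i, j] += stdp_kernel(dt)
    return elig

def apply_reward(weights, elig, reward):
    """Gate the accumulated eligibility by the reward to produce weight changes."""
    weights += ETA * reward * elig
    return np.clip(weights, 0.0, 1.0)             # keep weights bounded

# Usage: 4 presynaptic and 3 postsynaptic neurons, one rewarded trial.
w = np.random.uniform(0.2, 0.4, size=(4, 3))
e = np.zeros_like(w)
e = update_eligibility(e, dt_ms=1.0, pre_post_pairs=[(0, 1, 5.0), (2, 0, -8.0)])
w = apply_reward(w, e, reward=1.0)                # e.g. +1 for a correct sequence element
```

Because the eligibility trace decays slowly while STDP events accumulate quickly, this kind of rule separates fast synaptic tagging from slower, reward-gated consolidation, which is one common way to combine short-term and long-term plasticity in a single network.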

Bibliographic Details
Main Authors: Fang, Hongjian; Zeng, Yi; Zhao, Feifei
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7921721/
https://www.ncbi.nlm.nih.gov/pubmed/33664661
http://dx.doi.org/10.3389/fncom.2021.612041
Published in Front Comput Neurosci (Neuroscience section), 2021-02-16. Copyright © 2021 Fang, Zeng and Zhao. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY, http://creativecommons.org/licenses/by/4.0/). Use, distribution, or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution, or reproduction is permitted which does not comply with these terms.