
Organizing Sequential Memory in a Neuromorphic Device Using Dynamic Neural Fields

Bibliographic Details

Main Authors: Kreiser, Raphaela; Aathmani, Dora; Qiao, Ning; Indiveri, Giacomo; Sandamirskaya, Yulia
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2018
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6262404/
https://www.ncbi.nlm.nih.gov/pubmed/30524218
http://dx.doi.org/10.3389/fnins.2018.00717
author Kreiser, Raphaela
Aathmani, Dora
Qiao, Ning
Indiveri, Giacomo
Sandamirskaya, Yulia
collection PubMed
description Neuromorphic Very Large Scale Integration (VLSI) devices emulate the activation dynamics of biological neuronal networks using either mixed-signal analog/digital or purely digital electronic circuits. Using analog circuits in silicon to physically emulate the functionality of biological neurons and synapses enables faithful modeling of neural and synaptic dynamics at ultra-low power consumption in real time, and thus may serve as a computational substrate for a new generation of efficient neural controllers for artificially intelligent systems. Although one of the main advantages of neural networks is their ability to perform on-line learning, only a small number of neuromorphic hardware devices implement this feature on-chip. In this work, we use a reconfigurable on-line learning spiking (ROLLS) neuromorphic processor chip to build a neuronal architecture for sequence learning. The proposed neuronal architecture uses the attractor properties of winner-takes-all (WTA) dynamics to cope with mismatch and noise in the ROLLS analog computing elements, and it uses the on-chip plasticity features to store sequences of states. We demonstrate, in a proof-of-concept feasibility study, how this architecture can store, replay, and update sequences of states induced by external inputs. Controlled by the attractor dynamics and an explicit destabilizing signal, the items in a sequence can last for varying amounts of time, so that reliable sequence learning and replay can be robustly implemented in a real sensorimotor system.
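
The following is a minimal, illustrative rate-based sketch of the mechanism the abstract describes: winner-takes-all item populations hold the current sequence element as an attractor, self-sustained memory populations mark items that have already been active, plastic "order" connections store which item followed which, and an explicit destabilizing signal releases the current attractor so the next stored item can win. It is not the authors' spiking on-chip implementation; the network size, parameter values, rate-based dynamics, and the bookkeeping in the learning rule are assumptions made for illustration.

# Illustrative sketch only: rate-based stand-in for the WTA/plasticity mechanism
# summarized in the abstract, NOT the ROLLS on-chip spiking implementation.
import numpy as np

N = 4                      # number of items (WTA populations); assumed value
TAU, DT = 10.0, 1.0        # field time constant and Euler step (arbitrary units)
H = -2.0                   # resting level: below threshold, activity needs input
W_SELF, W_INH = 6.0, 2.0   # self-excitation vs. global inhibition -> WTA attractor
W_X2M, W_MSELF = 3.0, 4.0  # item -> memory drive; memory self-excitation (sustained)
W_M2X_OWN = 3.0            # a memory node inhibits its own item (no re-selection)
W_ORDER = 3.0              # value an order synapse takes once "switched on"
COS_INH = 6.0              # strength of the explicit destabilizing signal

def f(u):                  # steep sigmoidal rate function
    return 1.0 / (1.0 + np.exp(-4.0 * u))

class SequenceNet:
    def __init__(self):
        self.w = np.zeros((N, N))   # plastic order weights: memory i -> item j
        self.reset()

    def reset(self):                # reset activity, keep learned weights
        self.ux = np.full(N, H)     # item (WTA) populations
        self.um = np.full(N, H)     # memory populations
        self.prev = None            # bookkeeping for learning (simplification)

    def step(self, ext=None, cos=False):
        fx, fm = f(self.ux), f(self.um)
        ix = (H + W_SELF * fx - W_INH * fx.sum()   # WTA attractor dynamics
              + self.w.T @ fm                      # learned drive from memories
              - W_M2X_OWN * fm)                    # own memory suppresses its item
        if ext is not None:
            ix += ext
        if cos:
            ix -= COS_INH                          # destabilize all item attractors
        im = H + W_MSELF * fm + W_X2M * fx         # memories latch on and stay on
        self.ux += DT / TAU * (-self.ux + ix)
        self.um += DT / TAU * (-self.um + im)

    def learn(self):
        # Bistable-style Hebbian update: when an item is strongly active, switch on
        # the order synapse from the previously active item to it. The explicit
        # `prev` bookkeeping stands in for purely local spike-based plasticity.
        k = int(np.argmax(self.ux))
        if f(self.ux[k]) > 0.9:
            if self.prev is not None and self.prev != k:
                self.w[self.prev, k] = W_ORDER
            self.prev = k

def stimulate(i, amp=6.0):
    s = np.zeros(N)
    s[i] = amp
    return s

net = SequenceNet()

# Teaching: externally induce items 0 -> 1 -> 2; between items, the destabilizing
# signal clears the current attractor while the memory nodes keep their trace.
for item in (0, 1, 2):
    for _ in range(150):
        net.step(ext=stimulate(item))
        net.learn()
    for _ in range(60):
        net.step(cos=True)

# Replay: reset activity (weights persist), cue only the first item, then alternate
# destabilization and free-running phases; the learned order connections select
# each subsequent winner.
net.reset()
order = []
for _ in range(150):
    net.step(ext=stimulate(0))
order.append(int(np.argmax(net.ux)))
for _ in range(2):
    for _ in range(60):
        net.step(cos=True)
    for _ in range(150):
        net.step()
    order.append(int(np.argmax(net.ux)))

print("replayed sequence:", order)   # expected: [0, 1, 2]

Running the script teaches the order 0 -> 1 -> 2 by external stimulation and then replays it from a single cue. The design choice mirrored from the abstract is that each item attractor persists for an arbitrary amount of time and only yields when the explicit destabilizing signal arrives.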
format Online
Article
Text
id pubmed-6262404
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-6262404 2018-12-06 Front Neurosci (Neuroscience) Frontiers Media S.A. 2018-11-13 /pmc/articles/PMC6262404/ /pubmed/30524218 http://dx.doi.org/10.3389/fnins.2018.00717 Text en Copyright © 2018 Kreiser, Aathmani, Qiao, Indiveri and Sandamirskaya. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title Organizing Sequential Memory in a Neuromorphic Device Using Dynamic Neural Fields
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6262404/
https://www.ncbi.nlm.nih.gov/pubmed/30524218
http://dx.doi.org/10.3389/fnins.2018.00717