
Generalised Analog LSTMs Recurrent Modules for Neural Computing


Bibliographic Details
Main Authors: Adam, Kazybek, Smagulova, Kamilya, James, Alex
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8506007/
https://www.ncbi.nlm.nih.gov/pubmed/34650420
http://dx.doi.org/10.3389/fncom.2021.705050
_version_ 1784581649600610304
author Adam, Kazybek
Smagulova, Kamilya
James, Alex
author_facet Adam, Kazybek
Smagulova, Kamilya
James, Alex
author_sort Adam, Kazybek
collection PubMed
description The human brain can be considered a complex, dynamic, and recurrent neural network. There are several models of neural networks of the human brain that cover sensory to cortical information processing. The large majority of these models include feedback mechanisms that are hard to formalise for realistic applications. Recurrent neural networks and long short-term memory (LSTM) networks draw inspiration from neuronal feedback networks. LSTM prevents the vanishing and exploding gradient problems faced by simple recurrent neural networks and can process order-dependent data. Such recurrent neural units can be replicated in hardware and interfaced with analog sensors for efficient and miniaturised implementations of intelligent processing. Implementation of analog memristive LSTM hardware is an open research problem and can offer the advantages of continuous-domain analog computing with relatively low on-chip area compared with a digital-only implementation. Designed for solving time-series prediction problems, the overall architectures and circuits were tested with TSMC 0.18 μm CMOS technology and hafnium-oxide (HfO(2))-based memristor crossbars. Extensive circuit-based SPICE simulations, with over 3,500 runs (inference only) and 300 system-level simulations (training and inference), were performed to benchmark the system performance of the proposed implementations. The analysis includes Monte Carlo simulations of the variability of the memristors' conductance and of crossbar parasitics, where the non-idealities of hybrid CMOS-memristor circuits are taken into account.
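The abstract refers to LSTM units whose gated, additive cell-state update mitigates the vanishing and exploding gradient problems of simple recurrent networks. As a point of reference, here is a minimal NumPy sketch of the standard (textbook) LSTM forward step — not the paper's analog circuit; the weight names `W`, `U`, `b` and the layer sizes are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b stack the input, forget, cell,
    and output gate parameters along the first axis (4*n rows)."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b           # all four gate pre-activations
    i = sigmoid(z[0:n])                  # input gate
    f = sigmoid(z[n:2 * n])              # forget gate
    g = np.tanh(z[2 * n:3 * n])          # candidate cell update
    o = sigmoid(z[3 * n:4 * n])          # output gate
    c = f * c_prev + i * g               # additive update: gradients can flow through c
    h = o * np.tanh(c)                   # gated hidden state
    return h, c

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h = np.zeros(n_hid)
c = np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):     # process a short sequence in order
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (4,)
```

In an analog memristive realisation such as the one the abstract describes, the matrix-vector products `W @ x` and `U @ h_prev` are the operations a memristor crossbar can compute in the continuous domain, with the weights stored as device conductances.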
format Online
Article
Text
id pubmed-8506007
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-85060072021-10-13 Generalised Analog LSTMs Recurrent Modules for Neural Computing Adam, Kazybek Smagulova, Kamilya James, Alex Front Comput Neurosci Neuroscience The human brain can be considered a complex, dynamic, and recurrent neural network. There are several models of neural networks of the human brain that cover sensory to cortical information processing. The large majority of these models include feedback mechanisms that are hard to formalise for realistic applications. Recurrent neural networks and long short-term memory (LSTM) networks draw inspiration from neuronal feedback networks. LSTM prevents the vanishing and exploding gradient problems faced by simple recurrent neural networks and can process order-dependent data. Such recurrent neural units can be replicated in hardware and interfaced with analog sensors for efficient and miniaturised implementations of intelligent processing. Implementation of analog memristive LSTM hardware is an open research problem and can offer the advantages of continuous-domain analog computing with relatively low on-chip area compared with a digital-only implementation. Designed for solving time-series prediction problems, the overall architectures and circuits were tested with TSMC 0.18 μm CMOS technology and hafnium-oxide (HfO(2))-based memristor crossbars. Extensive circuit-based SPICE simulations, with over 3,500 runs (inference only) and 300 system-level simulations (training and inference), were performed to benchmark the system performance of the proposed implementations. The analysis includes Monte Carlo simulations of the variability of the memristors' conductance and of crossbar parasitics, where the non-idealities of hybrid CMOS-memristor circuits are taken into account. Frontiers Media S.A. 2021-09-28 /pmc/articles/PMC8506007/ /pubmed/34650420 http://dx.doi.org/10.3389/fncom.2021.705050 Text en Copyright © 2021 Adam, Smagulova and James.
https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Adam, Kazybek
Smagulova, Kamilya
James, Alex
Generalised Analog LSTMs Recurrent Modules for Neural Computing
title Generalised Analog LSTMs Recurrent Modules for Neural Computing
title_full Generalised Analog LSTMs Recurrent Modules for Neural Computing
title_fullStr Generalised Analog LSTMs Recurrent Modules for Neural Computing
title_full_unstemmed Generalised Analog LSTMs Recurrent Modules for Neural Computing
title_short Generalised Analog LSTMs Recurrent Modules for Neural Computing
title_sort generalised analog lstms recurrent modules for neural computing
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8506007/
https://www.ncbi.nlm.nih.gov/pubmed/34650420
http://dx.doi.org/10.3389/fncom.2021.705050
work_keys_str_mv AT adamkazybek generalisedanaloglstmsrecurrentmodulesforneuralcomputing
AT smagulovakamilya generalisedanaloglstmsrecurrentmodulesforneuralcomputing
AT jamesalex generalisedanaloglstmsrecurrentmodulesforneuralcomputing