
Multi-neuron connection using multi-terminal floating-gate memristor for unsupervised learning


Bibliographic Details
Main Authors: Won, Ui Yeon, An Vu, Quoc, Park, Sung Bum, Park, Mi Hyang, Dam Do, Van, Park, Hyun Jun, Yang, Heejun, Lee, Young Hee, Yu, Woo Jong
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10224934/
https://www.ncbi.nlm.nih.gov/pubmed/37244897
http://dx.doi.org/10.1038/s41467-023-38667-3
_version_ 1785050297926680576
author Won, Ui Yeon
An Vu, Quoc
Park, Sung Bum
Park, Mi Hyang
Dam Do, Van
Park, Hyun Jun
Yang, Heejun
Lee, Young Hee
Yu, Woo Jong
author_facet Won, Ui Yeon
An Vu, Quoc
Park, Sung Bum
Park, Mi Hyang
Dam Do, Van
Park, Hyun Jun
Yang, Heejun
Lee, Young Hee
Yu, Woo Jong
author_sort Won, Ui Yeon
collection PubMed
description Multi-terminal memristors and memtransistors (MT-MEMs) have successfully performed complex functions of heterosynaptic plasticity in synapses. However, these MT-MEMs lack the ability to emulate the membrane potential of a neuron in multiple neuronal connections. Here, we demonstrate multi-neuron connection using a multi-terminal floating-gate memristor (MT-FGMEM). The variable Fermi level (E_F) in graphene allows charging and discharging of the MT-FGMEM using horizontally distant multiple electrodes. Our MT-FGMEM demonstrates a high on/off ratio of over 10^5 at 1000 s retention, ~10,000 times higher than other MT-MEMs. The linear relationship between the current (I_D) and the floating-gate potential (V_FG) in the triode region of the MT-FGMEM allows for accurate spike integration at the neuron membrane. The MT-FGMEM fully mimics the temporal and spatial summation of multi-neuron connections based on leaky-integrate-and-fire (LIF) functionality. Our artificial neuron (150 pJ) reduces energy consumption by about 100,000 times compared with conventional neurons based on silicon integrated circuits (11.7 μJ). By integrating neurons and synapses using MT-FGMEMs, spiking neurosynaptic training and classification of directional lines, as performed in visual area one (V1), are successfully emulated based on the neuron's LIF and the synapse's spike-timing-dependent plasticity (STDP) functions. A simulation of unsupervised learning based on our artificial neuron and synapse achieves a learning accuracy of 83.08% on the unlabeled MNIST handwritten-digit dataset.
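The description above refers to leaky-integrate-and-fire (LIF) temporal summation: closely spaced input spikes accumulate on the membrane before leaking away, while sparse spikes decay first. A minimal sketch of that behavior, assuming illustrative parameter values (tau, threshold, spike weight) that are not taken from the paper:

```python
def simulate_lif(input_spikes, tau=20.0, v_threshold=1.0, v_reset=0.0,
                 dt=1.0, weight=0.3):
    """Integrate a binary spike train (0/1 per time step) into a leaky
    membrane potential; emit an output spike and reset on threshold.

    All parameters are illustrative assumptions, not values from the paper.
    """
    v = v_reset
    output_spikes = []
    for s in input_spikes:
        # Leak toward the resting potential, then add the weighted input spike.
        v += dt * (-(v - v_reset) / tau) + weight * s
        if v >= v_threshold:
            output_spikes.append(1)  # fire
            v = v_reset              # and reset the membrane
        else:
            output_spikes.append(0)
    return output_spikes

# Temporal summation: a dense burst reaches threshold and fires,
# while the same number of sparse spikes leaks away without firing.
dense = simulate_lif([1, 1, 1, 1, 0, 0])
sparse = simulate_lif([1, 0, 0, 0, 0, 1])
```

Spatial summation would follow the same pattern with multiple weighted input trains added per time step, one per presynaptic terminal.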
format Online
Article
Text
id pubmed-10224934
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-102249342023-05-29 Multi-neuron connection using multi-terminal floating-gate memristor for unsupervised learning Won, Ui Yeon An Vu, Quoc Park, Sung Bum Park, Mi Hyang Dam Do, Van Park, Hyun Jun Yang, Heejun Lee, Young Hee Yu, Woo Jong Nat Commun Article Multi-terminal memristors and memtransistors (MT-MEMs) have successfully performed complex functions of heterosynaptic plasticity in synapses. However, these MT-MEMs lack the ability to emulate the membrane potential of a neuron in multiple neuronal connections. Here, we demonstrate multi-neuron connection using a multi-terminal floating-gate memristor (MT-FGMEM). The variable Fermi level (E_F) in graphene allows charging and discharging of the MT-FGMEM using horizontally distant multiple electrodes. Our MT-FGMEM demonstrates a high on/off ratio of over 10^5 at 1000 s retention, ~10,000 times higher than other MT-MEMs. The linear relationship between the current (I_D) and the floating-gate potential (V_FG) in the triode region of the MT-FGMEM allows for accurate spike integration at the neuron membrane. The MT-FGMEM fully mimics the temporal and spatial summation of multi-neuron connections based on leaky-integrate-and-fire (LIF) functionality. Our artificial neuron (150 pJ) reduces energy consumption by about 100,000 times compared with conventional neurons based on silicon integrated circuits (11.7 μJ). By integrating neurons and synapses using MT-FGMEMs, spiking neurosynaptic training and classification of directional lines, as performed in visual area one (V1), are successfully emulated based on the neuron's LIF and the synapse's spike-timing-dependent plasticity (STDP) functions. A simulation of unsupervised learning based on our artificial neuron and synapse achieves a learning accuracy of 83.08% on the unlabeled MNIST handwritten-digit dataset.
Nature Publishing Group UK 2023-05-27 /pmc/articles/PMC10224934/ /pubmed/37244897 http://dx.doi.org/10.1038/s41467-023-38667-3 Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) .
spellingShingle Article
Won, Ui Yeon
An Vu, Quoc
Park, Sung Bum
Park, Mi Hyang
Dam Do, Van
Park, Hyun Jun
Yang, Heejun
Lee, Young Hee
Yu, Woo Jong
Multi-neuron connection using multi-terminal floating–gate memristor for unsupervised learning
title Multi-neuron connection using multi-terminal floating–gate memristor for unsupervised learning
title_full Multi-neuron connection using multi-terminal floating–gate memristor for unsupervised learning
title_fullStr Multi-neuron connection using multi-terminal floating–gate memristor for unsupervised learning
title_full_unstemmed Multi-neuron connection using multi-terminal floating–gate memristor for unsupervised learning
title_short Multi-neuron connection using multi-terminal floating–gate memristor for unsupervised learning
title_sort multi-neuron connection using multi-terminal floating–gate memristor for unsupervised learning
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10224934/
https://www.ncbi.nlm.nih.gov/pubmed/37244897
http://dx.doi.org/10.1038/s41467-023-38667-3
work_keys_str_mv AT wonuiyeon multineuronconnectionusingmultiterminalfloatinggatememristorforunsupervisedlearning
AT anvuquoc multineuronconnectionusingmultiterminalfloatinggatememristorforunsupervisedlearning
AT parksungbum multineuronconnectionusingmultiterminalfloatinggatememristorforunsupervisedlearning
AT parkmihyang multineuronconnectionusingmultiterminalfloatinggatememristorforunsupervisedlearning
AT damdovan multineuronconnectionusingmultiterminalfloatinggatememristorforunsupervisedlearning
AT parkhyunjun multineuronconnectionusingmultiterminalfloatinggatememristorforunsupervisedlearning
AT yangheejun multineuronconnectionusingmultiterminalfloatinggatememristorforunsupervisedlearning
AT leeyounghee multineuronconnectionusingmultiterminalfloatinggatememristorforunsupervisedlearning
AT yuwoojong multineuronconnectionusingmultiterminalfloatinggatememristorforunsupervisedlearning