
Mapping the BCPNN Learning Rule to a Memristor Model



Bibliographic Details
Main Authors: Wang, Deyu, Xu, Jiawei, Stathis, Dimitrios, Zhang, Lianhao, Li, Feng, Lansner, Anders, Hemani, Ahmed, Yang, Yu, Herman, Pawel, Zou, Zhuo
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8695980/
https://www.ncbi.nlm.nih.gov/pubmed/34955716
http://dx.doi.org/10.3389/fnins.2021.750458
_version_ 1784619702750806016
author Wang, Deyu
Xu, Jiawei
Stathis, Dimitrios
Zhang, Lianhao
Li, Feng
Lansner, Anders
Hemani, Ahmed
Yang, Yu
Herman, Pawel
Zou, Zhuo
author_facet Wang, Deyu
Xu, Jiawei
Stathis, Dimitrios
Zhang, Lianhao
Li, Feng
Lansner, Anders
Hemani, Ahmed
Yang, Yu
Herman, Pawel
Zou, Zhuo
author_sort Wang, Deyu
collection PubMed
description The Bayesian Confidence Propagation Neural Network (BCPNN) has been implemented in a way that allows mapping to neural and synaptic processes in the human cortex and has been used extensively in detailed spiking models of cortical associative memory function, and recently also for machine learning applications. In conventional digital implementations of BCPNN, the von Neumann bottleneck is a major challenge, with synaptic storage and access to it as the dominant cost. The memristor is a non-volatile device, ideal for artificial synapses, that fuses computation and storage and thus fundamentally overcomes the von Neumann bottleneck. While the implementation of other neural networks, such as the Spiking Neural Network (SNN) and even the Convolutional Neural Network (CNN), on memristors has been studied, the implementation of BCPNN has not. In this paper, the BCPNN learning rule is mapped to a memristor model and implemented with a memristor-based architecture. The implementation of the BCPNN learning rule is a mixed-signal design, with the main computation and storage happening in the analog domain. In particular, the nonlinear dopant drift phenomenon of the memristor is exploited to simulate the exponential decay of the synaptic state variables in the BCPNN learning rule. The consistency between the memristor-based solution and the BCPNN learning rule is simulated and verified in Matlab, with a correlation coefficient as high as 0.99. The analog circuit is designed and implemented in the SPICE simulation environment, demonstrating good emulation of the BCPNN learning rule, with a correlation coefficient as high as 0.98. This work focuses on demonstrating the feasibility of mapping the BCPNN learning rule to in-circuit computation in memristors. The feasibility of the memristor-based implementation is evaluated and validated in the paper, paving the way for a more efficient BCPNN implementation and, ultimately, a real-time brain emulation engine.
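The abstract above describes emulating the exponential decay of BCPNN synaptic state variables with a memristor's nonlinear dopant drift, and verifying the match with a correlation coefficient. The following is a minimal numerical sketch of that idea, not the paper's actual model: the time constant `tau_z`, the drift coefficient `k_drift`, the initial conditions, and the choice of the Joglekar window are all hypothetical assumptions for illustration.

```python
import numpy as np

# Hypothetical parameters -- not values from the paper.
tau_z = 0.05                  # BCPNN trace time constant (s), assumed
dt = 1e-4                     # integration step (s)
t = np.arange(0.0, 0.3, dt)

# A BCPNN synaptic trace with no presynaptic spikes obeys dz/dt = -z / tau_z.
# Forward-Euler integration of that spike-free decay:
z = np.empty_like(t)
z[0] = 1.0
for k in range(1, len(t)):
    z[k] = z[k - 1] - dt * z[k - 1] / tau_z

# The analytic exponential that the memristor circuit is meant to reproduce.
z_ref = np.exp(-t / tau_z)

# Pearson correlation coefficient, the consistency measure the paper reports.
r = np.corrcoef(z, z_ref)[0, 1]

# A crude memristor state variable x in (0, 1) under a constant bias, with
# nonlinear dopant drift modeled by the Joglekar window f(x) = 4x(1 - x)
# (p = 1); k_drift lumps device constants and the bias current (assumed).
k_drift = 30.0
x = np.empty_like(t)
x[0] = 0.9
for k in range(1, len(t)):
    x[k] = x[k - 1] - dt * k_drift * 4.0 * x[k - 1] * (1.0 - x[k - 1])
```

In this sketch the discretized trace tracks the analytic exponential almost exactly, and the windowed memristor state decays monotonically toward its lower bound; the paper's contribution is tuning the device dynamics and circuit so that the two decays coincide closely enough to yield the reported correlation of 0.99.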
format Online
Article
Text
id pubmed-8695980
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-8695980 2021-12-24 Mapping the BCPNN Learning Rule to a Memristor Model Wang, Deyu Xu, Jiawei Stathis, Dimitrios Zhang, Lianhao Li, Feng Lansner, Anders Hemani, Ahmed Yang, Yu Herman, Pawel Zou, Zhuo Front Neurosci Neuroscience The Bayesian Confidence Propagation Neural Network (BCPNN) has been implemented in a way that allows mapping to neural and synaptic processes in the human cortex and has been used extensively in detailed spiking models of cortical associative memory function and recently also for machine learning applications. In conventional digital implementations of BCPNN, the von Neumann bottleneck is a major challenge with synaptic storage and access to it as the dominant cost. The memristor is a non-volatile device ideal for artificial synapses that fuses computation and storage and thus fundamentally overcomes the von Neumann bottleneck. While the implementation of other neural networks like Spiking Neural Network (SNN) and even Convolutional Neural Network (CNN) on memristors has been studied, the implementation of BCPNN has not. In this paper, the BCPNN learning rule is mapped to a memristor model and implemented with a memristor-based architecture. The implementation of the BCPNN learning rule is a mixed-signal design with the main computation and storage happening in the analog domain. In particular, the nonlinear dopant drift phenomenon of the memristor is exploited to simulate the exponential decay of the synaptic state variables in the BCPNN learning rule. The consistency between the memristor-based solution and the BCPNN learning rule is simulated and verified in Matlab, with a correlation coefficient as high as 0.99. The analog circuit is designed and implemented in the SPICE simulation environment, demonstrating a good emulation effect for the BCPNN learning rule with a correlation coefficient as high as 0.98.
This work focuses on demonstrating the feasibility of mapping the BCPNN learning rule to in-circuit computation in memristors. The feasibility of the memristor-based implementation is evaluated and validated in the paper, to pave the way for a more efficient BCPNN implementation, toward a real-time brain emulation engine. Frontiers Media S.A. 2021-12-09 /pmc/articles/PMC8695980/ /pubmed/34955716 http://dx.doi.org/10.3389/fnins.2021.750458 Text en Copyright © 2021 Wang, Xu, Stathis, Zhang, Li, Lansner, Hemani, Yang, Herman and Zou. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Wang, Deyu
Xu, Jiawei
Stathis, Dimitrios
Zhang, Lianhao
Li, Feng
Lansner, Anders
Hemani, Ahmed
Yang, Yu
Herman, Pawel
Zou, Zhuo
Mapping the BCPNN Learning Rule to a Memristor Model
title Mapping the BCPNN Learning Rule to a Memristor Model
title_full Mapping the BCPNN Learning Rule to a Memristor Model
title_fullStr Mapping the BCPNN Learning Rule to a Memristor Model
title_full_unstemmed Mapping the BCPNN Learning Rule to a Memristor Model
title_short Mapping the BCPNN Learning Rule to a Memristor Model
title_sort mapping the bcpnn learning rule to a memristor model
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8695980/
https://www.ncbi.nlm.nih.gov/pubmed/34955716
http://dx.doi.org/10.3389/fnins.2021.750458
work_keys_str_mv AT wangdeyu mappingthebcpnnlearningruletoamemristormodel
AT xujiawei mappingthebcpnnlearningruletoamemristormodel
AT stathisdimitrios mappingthebcpnnlearningruletoamemristormodel
AT zhanglianhao mappingthebcpnnlearningruletoamemristormodel
AT lifeng mappingthebcpnnlearningruletoamemristormodel
AT lansneranders mappingthebcpnnlearningruletoamemristormodel
AT hemaniahmed mappingthebcpnnlearningruletoamemristormodel
AT yangyu mappingthebcpnnlearningruletoamemristormodel
AT hermanpawel mappingthebcpnnlearningruletoamemristormodel
AT zouzhuo mappingthebcpnnlearningruletoamemristormodel