
Back-Propagation Operation for Analog Neural Network Hardware with Synapse Components Having Hysteresis Characteristics

Bibliographic Details
Main Authors: Ueda, Michihito, Nishitani, Yu, Kaneko, Yukihiro, Omote, Atsushi
Format: Online Article Text
Language: English
Published: Public Library of Science 2014
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4231062/
https://www.ncbi.nlm.nih.gov/pubmed/25393715
http://dx.doi.org/10.1371/journal.pone.0112659
_version_ 1782344374285238272
author Ueda, Michihito
Nishitani, Yu
Kaneko, Yukihiro
Omote, Atsushi
author_facet Ueda, Michihito
Nishitani, Yu
Kaneko, Yukihiro
Omote, Atsushi
author_sort Ueda, Michihito
collection PubMed
description To realize analog artificial neural network hardware, the circuit element that provides the synapse function is important because the number of synapse elements is much larger than the number of neuron elements. One candidate for this synapse element is the ferroelectric memristor. This device functions as a voltage-controllable variable resistor, which can be used to store a synapse weight. However, its conductance shows hysteresis and dispersion with respect to the input voltage, so the conductance value varies according to the history of the height and width of the applied voltage pulses. Because the conductance is difficult to control accurately, it is not easy to apply the back-propagation learning algorithm to neural network hardware with memristor synapses. To solve this problem, we proposed and simulated the following learning operation procedure. Employing a weight perturbation technique, we derived the change in the error. When the error decreased, the next pulse voltage was updated according to the back-propagation learning algorithm. When the error increased, the amplitude of the next voltage pulse was set so as to produce a similar memristor conductance but from the opposite voltage scanning direction. This operation eliminated the effect of hysteresis, and we confirmed that the simulated learning operation converged. We also included conductance dispersion numerically in the simulation and examined the probability that the error decreased to a designated value within a predetermined number of loops. A ferroelectric has the characteristic that the magnitude of its polarization does not decrease when voltages of the same polarity are applied. This characteristic greatly improved the probability, even for a small learning rate, provided the magnitude of the dispersion was adequate. Because dispersion among analog circuit elements is inevitable, this learning operation procedure is useful for analog neural network hardware. (A code sketch of this procedure appears after the record below.)
format Online
Article
Text
id pubmed-4231062
institution National Center for Biotechnology Information
language English
publishDate 2014
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-4231062 2014-11-18 Back-Propagation Operation for Analog Neural Network Hardware with Synapse Components Having Hysteresis Characteristics Ueda, Michihito Nishitani, Yu Kaneko, Yukihiro Omote, Atsushi PLoS One Research Article To realize analog artificial neural network hardware, the circuit element that provides the synapse function is important because the number of synapse elements is much larger than the number of neuron elements. One candidate for this synapse element is the ferroelectric memristor. This device functions as a voltage-controllable variable resistor, which can be used to store a synapse weight. However, its conductance shows hysteresis and dispersion with respect to the input voltage, so the conductance value varies according to the history of the height and width of the applied voltage pulses. Because the conductance is difficult to control accurately, it is not easy to apply the back-propagation learning algorithm to neural network hardware with memristor synapses. To solve this problem, we proposed and simulated the following learning operation procedure. Employing a weight perturbation technique, we derived the change in the error. When the error decreased, the next pulse voltage was updated according to the back-propagation learning algorithm. When the error increased, the amplitude of the next voltage pulse was set so as to produce a similar memristor conductance but from the opposite voltage scanning direction. This operation eliminated the effect of hysteresis, and we confirmed that the simulated learning operation converged. We also included conductance dispersion numerically in the simulation and examined the probability that the error decreased to a designated value within a predetermined number of loops. A ferroelectric has the characteristic that the magnitude of its polarization does not decrease when voltages of the same polarity are applied. This characteristic greatly improved the probability, even for a small learning rate, provided the magnitude of the dispersion was adequate. Because dispersion among analog circuit elements is inevitable, this learning operation procedure is useful for analog neural network hardware. Public Library of Science 2014-11-13 /pmc/articles/PMC4231062/ /pubmed/25393715 http://dx.doi.org/10.1371/journal.pone.0112659 Text en © 2014 Ueda et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
spellingShingle Research Article
Ueda, Michihito
Nishitani, Yu
Kaneko, Yukihiro
Omote, Atsushi
Back-Propagation Operation for Analog Neural Network Hardware with Synapse Components Having Hysteresis Characteristics
title Back-Propagation Operation for Analog Neural Network Hardware with Synapse Components Having Hysteresis Characteristics
title_full Back-Propagation Operation for Analog Neural Network Hardware with Synapse Components Having Hysteresis Characteristics
title_fullStr Back-Propagation Operation for Analog Neural Network Hardware with Synapse Components Having Hysteresis Characteristics
title_full_unstemmed Back-Propagation Operation for Analog Neural Network Hardware with Synapse Components Having Hysteresis Characteristics
title_short Back-Propagation Operation for Analog Neural Network Hardware with Synapse Components Having Hysteresis Characteristics
title_sort back-propagation operation for analog neural network hardware with synapse components having hysteresis characteristics
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4231062/
https://www.ncbi.nlm.nih.gov/pubmed/25393715
http://dx.doi.org/10.1371/journal.pone.0112659
work_keys_str_mv AT uedamichihito backpropagationoperationforanalogneuralnetworkhardwarewithsynapsecomponentshavinghysteresischaracteristics
AT nishitaniyu backpropagationoperationforanalogneuralnetworkhardwarewithsynapsecomponentshavinghysteresischaracteristics
AT kanekoyukihiro backpropagationoperationforanalogneuralnetworkhardwarewithsynapsecomponentshavinghysteresischaracteristics
AT omoteatsushi backpropagationoperationforanalogneuralnetworkhardwarewithsynapsecomponentshavinghysteresischaracteristics
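
The description field above outlines a learning procedure: perturb a synapse weight with a small probe pulse, observe the change in error, apply a back-propagation-style update pulse when the error decreases, and apply an opposite-polarity pulse that returns the conductance from the other voltage scanning direction when the error increases. The Python sketch below is only an illustration of that control flow; the HystereticSynapse model, the single-neuron error function, and every parameter value (perturb_v, eta, loop limits) are hypothetical assumptions and are not taken from the paper or its device measurements.

```python
import numpy as np

rng = np.random.default_rng(0)


class HystereticSynapse:
    """Toy voltage-controlled conductance with a state-dependent (hysteretic) response."""

    def __init__(self, g=0.5, dispersion=0.02):
        self.g = g                      # conductance, read out as the synapse weight
        self.dispersion = dispersion    # relative spread of the conductance step

    def apply_pulse(self, v):
        # The step depends on the present state and the pulse polarity, so the
        # same pulse does not always produce the same conductance change.
        dg = 0.1 * np.tanh(v) * ((1.0 - self.g) if v > 0 else self.g)
        dg *= 1.0 + self.dispersion * rng.standard_normal()   # device dispersion
        self.g = float(np.clip(self.g + dg, 0.0, 1.0))


def network_error(weights, x, target):
    # Single linear neuron standing in for the paper's larger network.
    y = float(np.dot(weights, x))
    return 0.5 * (y - target) ** 2


synapses = [HystereticSynapse(g) for g in (0.2, 0.8, 0.5)]
x, target = np.array([1.0, 0.5, -0.3]), 0.4
perturb_v, eta = 0.05, 0.5             # probe-pulse amplitude and learning rate (assumed)

for loop in range(200):
    error = network_error(np.array([s.g for s in synapses]), x, target)

    for syn in synapses:
        # Weight perturbation: apply a small probe pulse and observe the error change.
        syn.apply_pulse(perturb_v)
        new_error = network_error(np.array([s.g for s in synapses]), x, target)

        if new_error < error:
            # Error decreased: continue in this direction with a pulse sized by the
            # finite-difference gradient estimate (back-propagation-style update).
            grad = (new_error - error) / perturb_v
            syn.apply_pulse(-eta * grad)
        else:
            # Error increased: apply an opposite-polarity pulse intended to return the
            # conductance toward its previous value from the other scanning direction,
            # cancelling the hysteresis rather than fighting it.
            syn.apply_pulse(-perturb_v)

        error = network_error(np.array([s.g for s in synapses]), x, target)

    if error < 1e-4:
        print(f"converged after {loop + 1} loops, error = {error:.2e}")
        break
```

In a real device the amplitude of the corrective reverse pulse would be chosen from the measured hysteresis loop of the memristor rather than simply mirroring the probe pulse, and the success probability under dispersion would be estimated by repeating such runs many times.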