
An adiabatic method to train binarized artificial neural networks

An artificial neural network consists of neurons and synapses. A neuron produces output from its input according to a non-linear activation function such as the Sigmoid, Hyperbolic Tangent (Tanh), or Rectified Linear Unit (ReLU) function. Synapses connect the neuron outputs to their inputs with...


Bibliographic Details
Main Authors: Zhao, Yuansheng, Xiao, Jiang
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8492711/
https://www.ncbi.nlm.nih.gov/pubmed/34611220
http://dx.doi.org/10.1038/s41598-021-99191-2
_version_ 1784578976788774912
author Zhao, Yuansheng
Xiao, Jiang
author_facet Zhao, Yuansheng
Xiao, Jiang
author_sort Zhao, Yuansheng
collection PubMed
description An artificial neural network consists of neurons and synapses. A neuron produces output from its input according to a non-linear activation function such as the Sigmoid, Hyperbolic Tangent (Tanh), or Rectified Linear Unit (ReLU) function. Synapses connect the neuron outputs to their inputs with tunable real-valued weights. The most resource-demanding operations in realizing such neural networks are the multiply-accumulate (MAC) operations that compute the dot product between the real-valued outputs from neurons and the synapse weights. The efficiency of neural networks can be drastically enhanced if the neuron outputs and/or the weights can be trained to take binary values [Formula: see text] only, in which case the MAC can be replaced by simple XNOR operations. In this paper, we demonstrate an adiabatic training method that can binarize fully-connected and convolutional neural networks without modifying the network structure or size. This adiabatic training method requires only minimal changes to the training algorithm, and is tested on the following four tasks: recognition of handwritten digits using an ordinary fully-connected network, cat-dog recognition and audio recognition using convolutional neural networks, and image recognition with 10 classes (CIFAR-10) using ResNet-20 and VGG-Small networks. In all tasks, the performance of the binary neural networks trained by the adiabatic method is almost identical to that of networks trained with conventional real-valued weights and ReLU or Sigmoid activations. This adiabatic method can easily be applied to binarize different types of networks, and will considerably increase computational efficiency and greatly simplify the deployment of neural networks.
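The MAC-to-XNOR substitution that the abstract describes can be sketched as follows. This is an illustrative example, not code from the paper: it assumes the binary values are ±1 (the value the abstract's "[Formula: see text]" placeholder presumably stands for), packs each ±1 vector into an integer bitmask, and recovers the dot product from an XNOR followed by a popcount.

```python
def dot_mac(a, b):
    """Reference dot product via multiply-accumulate."""
    return sum(x * y for x, y in zip(a, b))

def encode(v):
    """Pack a +/-1 vector into an integer bitmask (bit i = 1 iff v[i] == +1)."""
    bits = 0
    for i, x in enumerate(v):
        if x == 1:
            bits |= 1 << i
    return bits

def dot_xnor(a_bits, b_bits, n):
    """Dot product of two packed +/-1 vectors of length n via XNOR + popcount."""
    mask = (1 << n) - 1
    xnor = ~(a_bits ^ b_bits) & mask   # bit i set iff a[i] == b[i]
    matches = bin(xnor).count("1")     # popcount of the XNOR result
    # Each match contributes +1 to the dot product, each mismatch -1:
    return 2 * matches - n

a = [1, -1, 1, 1, -1]
b = [1, 1, -1, 1, -1]
assert dot_xnor(encode(a), encode(b), len(a)) == dot_mac(a, b)  # both give 1
```

On hardware, the XNOR and popcount each act on a whole machine word at once, which is why replacing the real-valued MAC this way yields the efficiency gains the abstract claims.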
format Online
Article
Text
id pubmed-8492711
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-8492711 2021-10-07 An adiabatic method to train binarized artificial neural networks Zhao, Yuansheng Xiao, Jiang Sci Rep Article
Nature Publishing Group UK 2021-10-05 /pmc/articles/PMC8492711/ /pubmed/34611220 http://dx.doi.org/10.1038/s41598-021-99191-2 Text en © The Author(s) 2021 https://creativecommons.org/licenses/by/4.0/ Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
spellingShingle Article
Zhao, Yuansheng
Xiao, Jiang
An adiabatic method to train binarized artificial neural networks
title An adiabatic method to train binarized artificial neural networks
title_full An adiabatic method to train binarized artificial neural networks
title_fullStr An adiabatic method to train binarized artificial neural networks
title_full_unstemmed An adiabatic method to train binarized artificial neural networks
title_short An adiabatic method to train binarized artificial neural networks
title_sort adiabatic method to train binarized artificial neural networks
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8492711/
https://www.ncbi.nlm.nih.gov/pubmed/34611220
http://dx.doi.org/10.1038/s41598-021-99191-2
work_keys_str_mv AT zhaoyuansheng anadiabaticmethodtotrainbinarizedartificialneuralnetworks
AT xiaojiang anadiabaticmethodtotrainbinarizedartificialneuralnetworks
AT zhaoyuansheng adiabaticmethodtotrainbinarizedartificialneuralnetworks
AT xiaojiang adiabaticmethodtotrainbinarizedartificialneuralnetworks