AresB-Net: accurate residual binarized neural networks using shortcut concatenation and shuffled grouped convolution

This article proposes a novel network model, denoted as AresB-Net, that achieves more accurate residual binarized convolutional neural networks (CNNs). Even though residual CNNs enhance the classification accuracy of binarized neural networks with increasing feature resolution, the degraded classification accuracy remains the primary concern compared with real-valued residual CNNs.

Bibliographic Details
Main Author: Kim, HyunJin
Format: Online Article Text
Language: English
Published: PeerJ Inc. 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8022573/
https://www.ncbi.nlm.nih.gov/pubmed/33834112
http://dx.doi.org/10.7717/peerj-cs.454
_version_ 1783674958290878464
author Kim, HyunJin
author_facet Kim, HyunJin
author_sort Kim, HyunJin
collection PubMed
description This article proposes a novel network model, denoted as AresB-Net, that achieves more accurate residual binarized convolutional neural networks (CNNs). Even though residual CNNs enhance the classification accuracy of binarized neural networks with increasing feature resolution, the degraded classification accuracy remains the primary concern compared with real-valued residual CNNs. AresB-Net consists of novel basic blocks that amortize the severe error from binarization, yielding a well-balanced pyramid structure without downsampling convolution. In each basic block, the shortcut is added to the convolution output and then concatenated, and the expanded channels are shuffled for the next grouped convolution. For downsampling when stride > 1, our model adopts only a max-pooling layer to generate a low-cost shortcut. This structure facilitates feature reuse from previous layers, alleviating the error from the binarized convolution and increasing the classification accuracy with reduced computational costs and small weight-storage requirements. Despite the low hardware costs of binarized computations, the proposed model achieves remarkable classification accuracies on the CIFAR and ImageNet datasets.
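The two channel operations the abstract describes (concatenating the shortcut with the shortcut-plus-convolution output, then shuffling the expanded channels before the next grouped convolution) can be sketched in plain NumPy. This is a minimal illustration of the general shuffle/concatenation pattern, assuming a ShuffleNet-style channel shuffle; the function names and the exact block wiring are illustrative, not taken from the AresB-Net source code.

```python
import numpy as np

def channel_shuffle(x, groups):
    """Interleave channels across groups so the next grouped convolution
    sees a mix of every group's features. x has shape (N, C, H, W)."""
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    # Split channels into groups, swap the group/sub-channel axes, re-flatten.
    x = x.reshape(n, groups, c // groups, h, w)
    x = x.transpose(0, 2, 1, 3, 4)
    return x.reshape(n, c, h, w)

def concat_shortcut(conv_out, shortcut):
    """One plausible reading of the paper's block: add the shortcut to the
    (binarized) convolution output, then concatenate the shortcut itself,
    doubling the channel count for the next grouped convolution."""
    return np.concatenate([conv_out + shortcut, shortcut], axis=1)

# Toy example: 8 channels in 2 groups -> channels 0..3 and 4..7 interleave.
x = np.arange(8, dtype=np.float64).reshape(1, 8, 1, 1)
shuffled = channel_shuffle(x, groups=2)
print(shuffled.reshape(-1).tolist())  # [0.0, 4.0, 1.0, 5.0, 2.0, 6.0, 3.0, 7.0]
```

With 2 groups, channels [0 1 2 3 | 4 5 6 7] become [0 4 1 5 2 6 3 7]: each group of the following grouped convolution now receives features originating from both previous groups, which is what makes channel shuffle useful after the concatenation step.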
format Online
Article
Text
id pubmed-8022573
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher PeerJ Inc.
record_format MEDLINE/PubMed
spelling pubmed-80225732021-04-07 AresB-Net: accurate residual binarized neural networks using shortcut concatenation and shuffled grouped convolution Kim, HyunJin PeerJ Comput Sci Artificial Intelligence PeerJ Inc. 2021-03-26 /pmc/articles/PMC8022573/ /pubmed/33834112 http://dx.doi.org/10.7717/peerj-cs.454 Text en © 2021 Kim https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed.
For attribution, the original author(s), title, publication source (PeerJ Computer Science) and either DOI or URL of the article must be cited.
spellingShingle Artificial Intelligence
Kim, HyunJin
AresB-Net: accurate residual binarized neural networks using shortcut concatenation and shuffled grouped convolution
title AresB-Net: accurate residual binarized neural networks using shortcut concatenation and shuffled grouped convolution
title_full AresB-Net: accurate residual binarized neural networks using shortcut concatenation and shuffled grouped convolution
title_fullStr AresB-Net: accurate residual binarized neural networks using shortcut concatenation and shuffled grouped convolution
title_full_unstemmed AresB-Net: accurate residual binarized neural networks using shortcut concatenation and shuffled grouped convolution
title_short AresB-Net: accurate residual binarized neural networks using shortcut concatenation and shuffled grouped convolution
title_sort aresb-net: accurate residual binarized neural networks using shortcut concatenation and shuffled grouped convolution
topic Artificial Intelligence
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8022573/
https://www.ncbi.nlm.nih.gov/pubmed/33834112
http://dx.doi.org/10.7717/peerj-cs.454
work_keys_str_mv AT kimhyunjin aresbnetaccurateresidualbinarizedneuralnetworksusingshortcutconcatenationandshuffledgroupedconvolution