
A storage-efficient ensemble classification using filter sharing on binarized convolutional neural networks

Bibliographic Details
Main Authors: Kim, HyunJin; Alnemari, Mohammed; Bagherzadeh, Nader
Format: Online Article Text
Language: English
Published: PeerJ Inc., 2022
Subjects: Algorithms and Analysis of Algorithms
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9044348/
https://www.ncbi.nlm.nih.gov/pubmed/35494815
http://dx.doi.org/10.7717/peerj-cs.924
Collection: PubMed
Description: This paper proposes a storage-efficient ensemble classification scheme to overcome the low inference accuracy of binary neural networks (BNNs). When sufficient external power is available in a dynamically powered system, classification results can be enhanced by aggregating the outputs of multiple BNN classifiers. However, the memory required to store multiple classifiers is a significant burden in a lightweight system. Instead of adopting fully independent classifiers, the proposed scheme shares filters from a trained convolutional neural network (CNN) model to reduce the storage requirements of the binarized CNNs. While these filters are shared, the proposed method trains only the unfrozen learnable parameters in the retraining step. We compare and analyze the performance of the proposed ensemble-based systems across various ensemble types and BNN structures on the CIFAR datasets. Our experiments show that the proposed filter-sharing method scales with the number of classifiers and is effective in enhancing classification accuracy. With binarized ResNet-20 and ReActNet-10 on the CIFAR-100 dataset, the proposed scheme achieves 56.74% and 70.29% Top-1 accuracy with 10 BNN classifiers, improvements of 7.6% and 3.6% over a single BNN classifier.
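The description above outlines the core idea: ensemble members reuse filters shared from one trained model, and only the parameters left unfrozen are updated when the members are retrained, so adding classifiers costs little extra storage. Below is a minimal PyTorch sketch of that idea, not the authors' code: it assumes a single shared, frozen convolutional layer, per-member batch-normalization layers and linear heads as the unfrozen parameters, and logit averaging as the aggregation rule; binarization of weights and activations is omitted for brevity.

# Sketch only: filter-sharing ensemble with frozen shared filters.
# All module names, sizes, and the choice of trainable parameters are
# illustrative assumptions, not taken from the paper.
import torch
import torch.nn as nn


class SharedFilterMember(nn.Module):
    """One ensemble member: shared (frozen) conv filters + private trainable parts."""

    def __init__(self, shared_conv: nn.Conv2d, num_classes: int = 100):
        super().__init__()
        self.shared_conv = shared_conv                        # shared across members
        self.bn = nn.BatchNorm2d(shared_conv.out_channels)    # private, trainable
        self.head = nn.Linear(shared_conv.out_channels, num_classes)  # private, trainable

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = torch.relu(self.bn(self.shared_conv(x)))
        x = x.mean(dim=(2, 3))                                 # global average pooling
        return self.head(x)


def build_ensemble(num_members: int = 10, num_classes: int = 100) -> nn.ModuleList:
    # One set of filters reused by every member; freezing it means retraining
    # only updates each member's unfrozen (private) parameters.
    shared_conv = nn.Conv2d(3, 64, kernel_size=3, padding=1, bias=False)
    for p in shared_conv.parameters():
        p.requires_grad = False
    return nn.ModuleList(
        SharedFilterMember(shared_conv, num_classes) for _ in range(num_members)
    )


def ensemble_logits(members: nn.ModuleList, x: torch.Tensor) -> torch.Tensor:
    # Aggregate by averaging the members' logits.
    return torch.stack([m(x) for m in members], dim=0).mean(dim=0)


if __name__ == "__main__":
    ensemble = build_ensemble()
    x = torch.randn(4, 3, 32, 32)                              # CIFAR-sized input
    print(ensemble_logits(ensemble, x).shape)                  # torch.Size([4, 100])
    trainable = [p for p in ensemble.parameters() if p.requires_grad]
    print(f"trainable parameter tensors: {len(trainable)}")    # only BN and head weights

Averaging logits is just one possible aggregation rule; majority voting over the members' predicted labels is another common choice for ensembles of classifiers.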
Record ID: pubmed-9044348
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Published in: PeerJ Comput Sci (Algorithms and Analysis of Algorithms), 2022-03-29
Rights: © 2022 Kim et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose, provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ Computer Science) and either the DOI or the URL of the article must be cited.