Heuristic Method for Minimizing Model Size of CNN by Combining Multiple Pruning Techniques

Network pruning techniques have been widely used for compressing computational and memory intensive deep learning models through removing redundant components of the model. According to the pruning granularity, network pruning can be categorized into structured and unstructured methods. The structured pruning removes the large components in a model such as channels or layers, which might reduce the accuracy. The unstructured pruning directly removes mainly the parameters in a model as well as the redundant channels or layers, which might result in an inadequate pruning. To address the limitations of the pruning methods, this paper proposes a heuristic method for minimizing model size. This paper implements an algorithm to combine both the structured and the unstructured pruning methods while maintaining the target accuracy that is configured by its application. We use network slimming for the structured pruning method and deep compression for the unstructured one. Our method achieves a higher compression ratio than the case when the individual pruning method is applied. To show the effectiveness of our proposed method, this paper evaluates our proposed method with actual state-of-the-art CNN models of VGGNet, ResNet and DenseNet under the CIFAR-10 dataset. This paper discusses the performance of the proposed method with the cases of individual usage of the structured and unstructured pruning methods and then proves that our method achieves better performance with higher compression ratio. In the best case of the VGGNet, our method results in a 13× reduction ratio in the model size, and also gives a 15× reduction ratio regarding the pruning time compared with the brute-force search method.
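
The method summarized above combines network slimming (structured pruning driven by batch-normalization scale factors) with deep-compression-style magnitude pruning, accepting progressively more aggressive settings only while a target accuracy configured by the application still holds. This record does not include the authors' code, so the following is only a minimal hypothetical sketch of that combined-pruning idea in PyTorch: the names combined_prune, evaluate, bn_fracs and w_fracs, the fixed fraction grids, and the greedy accept/backtrack loop are illustrative assumptions rather than the paper's algorithm, and zeroing BN scales here merely stands in for the physical channel removal and fine-tuning a real implementation would perform.

# Hypothetical sketch (not the authors' implementation): greedily combine a
# structured step (zeroing the smallest batch-norm scales, as in network
# slimming) with an unstructured step (L1 magnitude pruning, as in deep
# compression), keeping the most aggressive model that still meets target_acc.
import copy
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def combined_prune(model: nn.Module, evaluate, target_acc: float,
                   bn_fracs=(0.1, 0.3, 0.5), w_fracs=(0.2, 0.5, 0.8)):
    best = copy.deepcopy(model)
    for bn_frac in bn_fracs:
        cand = copy.deepcopy(model)
        # Structured step: zero the channels whose BN scale factor falls in
        # the lowest bn_frac quantile (a stand-in for real channel removal).
        scales = torch.cat([m.weight.data.abs().flatten()
                            for m in cand.modules()
                            if isinstance(m, nn.BatchNorm2d)])
        thresh = torch.quantile(scales, bn_frac)
        for m in cand.modules():
            if isinstance(m, nn.BatchNorm2d):
                mask = (m.weight.data.abs() > thresh).float()
                m.weight.data.mul_(mask)
                m.bias.data.mul_(mask)
        if evaluate(cand) < target_acc:
            break  # structured pruning already too aggressive
        for w_frac in w_fracs:
            trial = copy.deepcopy(cand)
            # Unstructured step: remove the smallest-magnitude weights.
            for m in trial.modules():
                if isinstance(m, (nn.Conv2d, nn.Linear)):
                    prune.l1_unstructured(m, name="weight", amount=w_frac)
                    prune.remove(m, "weight")  # bake the mask into the weights
            if evaluate(trial) < target_acc:
                break  # back off to the last accepted setting
            best = trial
    return best

Here evaluate is assumed to be a user-supplied callable returning validation accuracy (for example, top-1 accuracy on a held-out CIFAR-10 split), which plays the role of the target-accuracy constraint described in the abstract.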

Bibliographic Details
Main Authors: Tian, Danhe, Yamagiwa, Shinichi, Wada, Koichi
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9371397/
https://www.ncbi.nlm.nih.gov/pubmed/35957431
http://dx.doi.org/10.3390/s22155874
_version_ 1784767128197398528
author Tian, Danhe
Yamagiwa, Shinichi
Wada, Koichi
author_facet Tian, Danhe
Yamagiwa, Shinichi
Wada, Koichi
author_sort Tian, Danhe
collection PubMed
description Network pruning techniques have been widely used for compressing computational and memory intensive deep learning models through removing redundant components of the model. According to the pruning granularity, network pruning can be categorized into structured and unstructured methods. The structured pruning removes the large components in a model such as channels or layers, which might reduce the accuracy. The unstructured pruning directly removes mainly the parameters in a model as well as the redundant channels or layers, which might result in an inadequate pruning. To address the limitations of the pruning methods, this paper proposes a heuristic method for minimizing model size. This paper implements an algorithm to combine both the structured and the unstructured pruning methods while maintaining the target accuracy that is configured by its application. We use network slimming for the structured pruning method and deep compression for the unstructured one. Our method achieves a higher compression ratio than the case when the individual pruning method is applied. To show the effectiveness of our proposed method, this paper evaluates our proposed method with actual state-of-the-art CNN models of VGGNet, ResNet and DenseNet under the CIFAR-10 dataset. This paper discusses the performance of the proposed method with the cases of individual usage of the structured and unstructured pruning methods and then proves that our method achieves better performance with higher compression ratio. In the best case of the VGGNet, our method results in a 13× reduction ratio in the model size, and also gives a 15× reduction ratio regarding the pruning time compared with the brute-force search method.
format Online
Article
Text
id pubmed-9371397
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-93713972022-08-12 Heuristic Method for Minimizing Model Size of CNN by Combining Multiple Pruning Techniques Tian, Danhe Yamagiwa, Shinichi Wada, Koichi Sensors (Basel) Article Network pruning techniques have been widely used for compressing computational and memory intensive deep learning models through removing redundant components of the model. According to the pruning granularity, network pruning can be categorized into structured and unstructured methods. The structured pruning removes the large components in a model such as channels or layers, which might reduce the accuracy. The unstructured pruning directly removes mainly the parameters in a model as well as the redundant channels or layers, which might result in an inadequate pruning. To address the limitations of the pruning methods, this paper proposes a heuristic method for minimizing model size. This paper implements an algorithm to combine both the structured and the unstructured pruning methods while maintaining the target accuracy that is configured by its application. We use network slimming for the structured pruning method and deep compression for the unstructured one. Our method achieves a higher compression ratio than the case when the individual pruning method is applied. To show the effectiveness of our proposed method, this paper evaluates our proposed method with actual state-of-the-art CNN models of VGGNet, ResNet and DenseNet under the CIFAR-10 dataset. This paper discusses the performance of the proposed method with the cases of individual usage of the structured and unstructured pruning methods and then proves that our method achieves better performance with higher compression ratio. In the best case of the VGGNet, our method results in a 13× reduction ratio in the model size, and also gives a 15× reduction ratio regarding the pruning time compared with the brute-force search method. MDPI 2022-08-05 /pmc/articles/PMC9371397/ /pubmed/35957431 http://dx.doi.org/10.3390/s22155874 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Tian, Danhe
Yamagiwa, Shinichi
Wada, Koichi
Heuristic Method for Minimizing Model Size of CNN by Combining Multiple Pruning Techniques
title Heuristic Method for Minimizing Model Size of CNN by Combining Multiple Pruning Techniques
title_full Heuristic Method for Minimizing Model Size of CNN by Combining Multiple Pruning Techniques
title_fullStr Heuristic Method for Minimizing Model Size of CNN by Combining Multiple Pruning Techniques
title_full_unstemmed Heuristic Method for Minimizing Model Size of CNN by Combining Multiple Pruning Techniques
title_short Heuristic Method for Minimizing Model Size of CNN by Combining Multiple Pruning Techniques
title_sort heuristic method for minimizing model size of cnn by combining multiple pruning techniques
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9371397/
https://www.ncbi.nlm.nih.gov/pubmed/35957431
http://dx.doi.org/10.3390/s22155874
work_keys_str_mv AT tiandanhe heuristicmethodforminimizingmodelsizeofcnnbycombiningmultiplepruningtechniques
AT yamagiwashinichi heuristicmethodforminimizingmodelsizeofcnnbycombiningmultiplepruningtechniques
AT wadakoichi heuristicmethodforminimizingmodelsizeofcnnbycombiningmultiplepruningtechniques