
Filter Pruning via Measuring Feature Map Information

Neural network pruning, an important method for reducing the computational complexity of deep models, is well suited to devices with limited resources. However, most current methods focus on some property of the filter itself when pruning the network and rarely explore the relationship between the feature maps and the filters. In this paper, two novel pruning methods are proposed. First, a new pruning method is proposed that reflects the importance of filters by exploring the information in the feature maps. Based on the premise that the more information a feature map contains, the more important it is, the information entropy of each feature map is used to measure its information content and thus to evaluate the importance of the corresponding filter in the current layer. Normalization is then applied so that scores can be compared across layers. As a result, the network structure is efficiently pruned while its performance is well preserved. Second, we propose a parallel pruning method that combines the method above with the slimming pruning method, yielding better results in terms of computational cost. Our methods outperform most state-of-the-art methods in terms of accuracy, parameters, and FLOPs. On ImageNet, ResNet50 achieves 72.02% top-1 accuracy with merely 11.41M parameters and 1.12B FLOPs. On CIFAR10, DenseNet40 obtains 94.04% accuracy with only 0.38M parameters and 110.72M FLOPs, and our parallel pruning method reduces the parameters and FLOPs to just 0.37M and 100.12M, respectively, with little loss of accuracy.
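The abstract describes the method only at a high level. As a concrete illustration, the sketch below scores each filter by the Shannon entropy of its output feature map over a calibration batch, min-max normalizes the scores within a layer so they can be compared across layers, and optionally fuses them with slimming-style batch-normalization scale factors for the parallel variant. This is a minimal NumPy sketch under stated assumptions: the 256-bin histogram entropy estimator, the min-max normalization, the equal-weight fusion rule, and all function names are illustrative, not the paper's exact formulation.

```python
import numpy as np

def feature_map_entropy(fmap, num_bins=256):
    """Shannon entropy (bits) of one feature map (H x W).
    Higher entropy is taken to mean more information, hence a more
    important filter. The histogram estimator is an assumption."""
    hist, _ = np.histogram(fmap, bins=num_bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log(0) is treated as 0
    return -np.sum(p * np.log2(p))

def layer_filter_scores(batch_fmaps, num_bins=256):
    """batch_fmaps: (N, C, H, W) outputs of one conv layer on a
    calibration batch. Returns one entropy score per filter,
    averaged over the N samples."""
    n, c = batch_fmaps.shape[:2]
    return np.array([
        np.mean([feature_map_entropy(batch_fmaps[i, j], num_bins)
                 for i in range(n)])
        for j in range(c)
    ])

def minmax_normalize(scores, eps=1e-12):
    """Scale per-layer scores to [0, 1] so that scores from different
    layers are comparable under a single global pruning threshold."""
    lo, hi = scores.min(), scores.max()
    return (scores - lo) / (hi - lo + eps)

def combined_scores(entropy_scores, bn_gammas, alpha=0.5):
    """Hypothetical fusion for the parallel variant: blend normalized
    entropy scores with slimming-style |gamma| scores from the layer's
    batch norm. The equal weighting is an assumption."""
    return (alpha * minmax_normalize(entropy_scores)
            + (1 - alpha) * minmax_normalize(np.abs(bn_gammas)))

# Toy usage: keep the top half of 16 filters by combined score.
rng = np.random.default_rng(0)
fmaps = rng.standard_normal((8, 16, 14, 14))   # N=8 calibration samples
gammas = rng.standard_normal(16)               # BN scale factors
scores = combined_scores(layer_filter_scores(fmaps), gammas)
keep = np.argsort(scores)[len(scores) // 2:]   # indices of filters kept
print(sorted(keep.tolist()))
```

In practice one would collect feature maps with forward hooks on a held-out batch, score every layer, and prune filters below a global threshold on the normalized scores before fine-tuning.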


Bibliographic Details
Main Authors: Shao, Linsong, Zuo, Haorui, Zhang, Jianlin, Xu, Zhiyong, Yao, Jinzhen, Wang, Zhixing, Li, Hong
Format: Online Article Text
Language: English
Published: Sensors (Basel), MDPI, 2021-10-02
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8512244/
https://www.ncbi.nlm.nih.gov/pubmed/34640921
http://dx.doi.org/10.3390/s21196601
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).