Neural Network Structure Optimization by Simulated Annealing

A critical problem in large neural networks is over-parameterization: the sheer number of weight parameters limits their use on edge devices, whose computational power and memory/storage are too scarce for such models. To make neural networks practical on edge devices and in real-time industrial applications, they need to be compressed in advance. Since edge devices cannot train networks themselves, and cannot fetch trained networks when internet access is scarce, preloading smaller networks is essential. Various works in the literature have shown that redundant branches can be pruned strategically from a fully connected network without sacrificing performance significantly. However, the majority of these methodologies demand high computational resources because they interleave weight training via the back-propagation algorithm with the network compression process. In this work, we draw attention to optimizing the network structure itself so that performance is preserved despite aggressive pruning. The structure optimization is performed with the simulated annealing algorithm alone, without back-propagation for branch weight training. Being a heuristic, non-convex optimization method, simulated annealing provides a near-globally-optimal solution to this NP-hard problem for a given percentage of branch pruning. Our simulation results show that simulated annealing can significantly reduce the complexity of a fully connected network while maintaining its performance, without the help of back-propagation.
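
The approach lends itself to a short illustration. The following is a minimal Python sketch, not the authors' published code: it anneals binary pruning masks over the branches of a small fully connected network with fixed, never-retrained weights, so no back-propagation is involved. The network size, the synthetic stand-in data, the geometric cooling schedule, and the swap-based neighbor move are all assumptions made for illustration.

    # Illustrative sketch only, not the paper's code: network size, data,
    # cooling schedule, and the neighbor move are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Fixed, untrained weights: 8 inputs -> 16 hidden units -> 3 classes.
    W1 = rng.normal(size=(8, 16))
    W2 = rng.normal(size=(16, 3))

    # Synthetic data standing in for a real validation set.
    X = rng.normal(size=(256, 8))
    y = rng.integers(0, 3, size=256)

    def accuracy(mask1, mask2):
        # Forward pass with pruned (zero-masked) branches.
        hidden = np.maximum(X @ (W1 * mask1), 0.0)  # ReLU
        logits = hidden @ (W2 * mask2)
        return np.mean(np.argmax(logits, axis=1) == y)

    def random_mask(shape, keep_frac):
        # Binary mask that keeps a fixed fraction of branches.
        mask = np.zeros(shape)
        kept = rng.choice(mask.size, size=int(keep_frac * mask.size), replace=False)
        mask.flat[kept] = 1.0
        return mask

    def neighbor(mask):
        # Swap one kept branch with one pruned branch; sparsity is unchanged.
        new = mask.copy()
        flat = new.ravel()  # writable view into the copy
        flat[rng.choice(np.flatnonzero(flat == 1.0))] = 0.0
        flat[rng.choice(np.flatnonzero(flat == 0.0))] = 1.0
        return new

    # Simulated annealing over the structure (the masks) only; the weights
    # are never updated, so back-propagation never runs.
    keep_frac = 0.3  # prune 70% of the branches
    m1 = random_mask(W1.shape, keep_frac)
    m2 = random_mask(W2.shape, keep_frac)
    energy = -accuracy(m1, m2)  # lower energy = better structure
    T = 1.0
    for _ in range(2000):
        c1, c2 = neighbor(m1), neighbor(m2)
        e_new = -accuracy(c1, c2)
        # Metropolis rule: always accept improvements; accept worse masks
        # with probability exp(-(e_new - energy) / T) to escape local optima.
        if e_new < energy or rng.random() < np.exp((energy - e_new) / T):
            m1, m2, energy = c1, c2, e_new
        T *= 0.995  # geometric cooling

    print(f"pruned-network accuracy after annealing: {-energy:.3f}")

Because each neighbor move swaps exactly one kept branch for one pruned branch, the pruning percentage stays fixed throughout the search, matching the abstract's "given percentage of branch pruning"; the occasional acceptance of worse masks is what lets the heuristic escape local optima in this non-convex, NP-hard problem.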

Bibliographic Details
Main Authors: Kuo, Chun Lin; Kuruoglu, Ercan Engin; Chan, Wai Kin Victor
Format: Online Article Text
Language: English
Published: Entropy (Basel), MDPI, 2022-02-28
Subjects: Article
Collection: PubMed (record pubmed-8947290, MEDLINE/PubMed format), National Center for Biotechnology Information
License: © 2022 by the authors; licensee MDPI, Basel, Switzerland. Open access under the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8947290/
https://www.ncbi.nlm.nih.gov/pubmed/35327859
http://dx.doi.org/10.3390/e24030348