
Compact Image-Style Transfer: Channel Pruning on the Single Training of a Network


Bibliographic Details
Main Authors: Kim, Minseong; Choi, Hyun-Chul
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9653671/
https://www.ncbi.nlm.nih.gov/pubmed/36366125
http://dx.doi.org/10.3390/s22218427
Collection: PubMed
Description: Recent image-style transfer methods use the structure of a VGG feature network to encode and decode the feature map of an image. Since this network was designed for the general image-classification task, it has a large number of channels and accordingly requires a large amount of memory and high computational power, which is not necessary for a relatively simple task such as image-style transfer. In this paper, we propose a new technique that shrinks the previously used style-transfer network, eliminating the redundancy of the VGG feature network in both memory consumption and computational cost. Our method automatically finds a number of consistently inactive convolution channels during the network training phase by using two new losses, i.e., channel loss and xor loss. The former maximizes the number of inactive channels, and the latter fixes the positions of these inactive channels to be the same across input images. Our method speeds up image generation by up to 49% and reduces the number of parameters by 20% while maintaining style-transfer performance. Additionally, our losses are also effective in pruning the VGG16 classifier network, reducing parameters by 26% while improving top-1 accuracy by 0.16% on CIFAR-10.
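The abstract only names the two losses; a minimal NumPy sketch of one plausible formulation is shown below. The function names, the activity measure, and the threshold are assumptions for illustration, not the authors' published definitions.

```python
import numpy as np

def channel_activity(features):
    # Mean absolute activation per channel.
    # features: (batch, channels, height, width) feature maps.
    return np.abs(features).mean(axis=(0, 2, 3))

def channel_loss(features):
    # Pushes as many channels as possible toward inactivity by
    # penalizing total per-channel activity (assumed form; the
    # paper's exact loss may differ).
    return channel_activity(features).sum()

def xor_loss(features_a, features_b, threshold=1e-3):
    # Penalizes channels whose active/inactive status differs
    # between two inputs, so the same channel positions stay
    # inactive for every image and can be pruned (again an
    # illustrative formulation, not the published one).
    mask_a = channel_activity(features_a) > threshold
    mask_b = channel_activity(features_b) > threshold
    return np.logical_xor(mask_a, mask_b).sum()
```

Note that the hard thresholding above is not differentiable; an actual training setup would need a soft relaxation of the masks to backpropagate through these terms.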
ID: pubmed-9653671
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Sensors (Basel)
Published: MDPI, 2022-11-02
License: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Topic: Article