Image Reconstruction Based on Progressive Multistage Distillation Convolution Neural Network

Bibliographic Details
Main Authors: Cai, Yuxi, Gao, Guxue, Jia, Zhenhong, Wang, Liejun, Lai, Huicheng
Format: Online Article Text
Language: English
Published: Hindawi 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9110147/
https://www.ncbi.nlm.nih.gov/pubmed/35586112
http://dx.doi.org/10.1155/2022/9637460
_version_ 1784709036203048960
author Cai, Yuxi
Gao, Guxue
Jia, Zhenhong
Wang, Liejun
Lai, Huicheng
author_facet Cai, Yuxi
Gao, Guxue
Jia, Zhenhong
Wang, Liejun
Lai, Huicheng
author_sort Cai, Yuxi
collection PubMed
description Current image-reconstruction algorithms can lose important features through coarse feature distillation and lose key channel information through compressed channel attention. To address these problems, we propose a progressive multistage distillation network that refines features gradually, in stages, to retain as much key feature information as possible. To further improve performance, we propose a weight-sharing, information-lossless attention block that enhances channel characteristics through a weight-sharing auxiliary path while using convolution layers to model interchannel dependencies without compression, thereby avoiding the information loss of earlier channel-attention designs. Extensive experiments on several benchmark data sets show that our algorithm strikes a good balance among performance, parameter count, and computational complexity, and delivers highly competitive results in both objective metrics and subjective visual quality, demonstrating its advantages for image reconstruction. These results indicate that this coarse-to-fine, gradual feature distillation is effective in improving network performance. Our code is available at https://github.com/Cai631/PMDN.
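The "information lossless" channel attention described in the abstract models interchannel dependencies without the squeeze-and-excitation bottleneck, so no channels are compressed away. A minimal NumPy sketch of that general idea follows; it is an illustration (in the style of ECA-like attention), not the authors' PMDN implementation, and the function name and kernel values are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention_no_compression(feat, kernel):
    """Channel attention without a reduction bottleneck.

    feat:   (C, H, W) feature map.
    kernel: (k,) 1-D convolution kernel slid across the channel
            descriptors -- there is no channel-reduction step,
            so no per-channel information is discarded.
    """
    c = feat.shape[0]
    # Global average pooling: one descriptor per channel
    desc = feat.mean(axis=(1, 2))                      # shape (C,)
    # 1-D convolution across channels with 'same' padding
    k = len(kernel)
    padded = np.pad(desc, k // 2)
    conv = np.array([np.dot(padded[i:i + k], kernel) for i in range(c)])
    # Sigmoid gate in (0, 1), then rescale each channel
    weights = sigmoid(conv)
    return feat * weights[:, None, None]

# Usage: a 4-channel 8x8 feature map and a 3-tap kernel
feat = np.random.default_rng(0).normal(size=(4, 8, 8))
out = channel_attention_no_compression(feat, np.array([0.2, 0.6, 0.2]))
print(out.shape)
```

Because the gating weights lie in (0, 1) and are computed from uncompressed per-channel descriptors, every channel is reweighted rather than projected through a smaller bottleneck, which is the property the abstract emphasizes.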
format Online
Article
Text
id pubmed-9110147
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-9110147 2022-05-17 Image Reconstruction Based on Progressive Multistage Distillation Convolution Neural Network Cai, Yuxi Gao, Guxue Jia, Zhenhong Wang, Liejun Lai, Huicheng Comput Intell Neurosci Research Article Current image-reconstruction algorithms can lose important features through coarse feature distillation and lose key channel information through compressed channel attention. To address these problems, we propose a progressive multistage distillation network that refines features gradually, in stages, to retain as much key feature information as possible. To further improve performance, we propose a weight-sharing, information-lossless attention block that enhances channel characteristics through a weight-sharing auxiliary path while using convolution layers to model interchannel dependencies without compression, thereby avoiding the information loss of earlier channel-attention designs. Extensive experiments on several benchmark data sets show that our algorithm strikes a good balance among performance, parameter count, and computational complexity, and delivers highly competitive results in both objective metrics and subjective visual quality, demonstrating its advantages for image reconstruction. These results indicate that this coarse-to-fine, gradual feature distillation is effective in improving network performance. Our code is available at https://github.com/Cai631/PMDN. Hindawi 2022-05-09 /pmc/articles/PMC9110147/ /pubmed/35586112 http://dx.doi.org/10.1155/2022/9637460 Text en Copyright © 2022 Yuxi Cai et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Cai, Yuxi
Gao, Guxue
Jia, Zhenhong
Wang, Liejun
Lai, Huicheng
Image Reconstruction Based on Progressive Multistage Distillation Convolution Neural Network
title Image Reconstruction Based on Progressive Multistage Distillation Convolution Neural Network
title_full Image Reconstruction Based on Progressive Multistage Distillation Convolution Neural Network
title_fullStr Image Reconstruction Based on Progressive Multistage Distillation Convolution Neural Network
title_full_unstemmed Image Reconstruction Based on Progressive Multistage Distillation Convolution Neural Network
title_short Image Reconstruction Based on Progressive Multistage Distillation Convolution Neural Network
title_sort image reconstruction based on progressive multistage distillation convolution neural network
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9110147/
https://www.ncbi.nlm.nih.gov/pubmed/35586112
http://dx.doi.org/10.1155/2022/9637460
work_keys_str_mv AT caiyuxi imagereconstructionbasedonprogressivemultistagedistillationconvolutionneuralnetwork
AT gaoguxue imagereconstructionbasedonprogressivemultistagedistillationconvolutionneuralnetwork
AT jiazhenhong imagereconstructionbasedonprogressivemultistagedistillationconvolutionneuralnetwork
AT wangliejun imagereconstructionbasedonprogressivemultistagedistillationconvolutionneuralnetwork
AT laihuicheng imagereconstructionbasedonprogressivemultistagedistillationconvolutionneuralnetwork