Stochastic Recursive Gradient Support Pursuit and Its Sparse Representation Applications

Bibliographic Details
Main authors: Shang, Fanhua; Wei, Bingkun; Liu, Yuanyuan; Liu, Hongying; Wang, Shuang; Jiao, Licheng
Format: Online, Article, Text
Language: English
Published: MDPI, 2020
Subjects: Article
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7506682/
https://www.ncbi.nlm.nih.gov/pubmed/32872609
http://dx.doi.org/10.3390/s20174902
Collection: PubMed
Description: In recent years, a series of matching pursuit and hard thresholding algorithms have been proposed to solve the sparse representation problem with an ℓ0-norm constraint. Some stochastic hard thresholding methods have also been proposed, such as stochastic gradient hard thresholding (SG-HT) and stochastic variance reduced gradient hard thresholding (SVRGHT). However, each iteration of all these algorithms requires one hard thresholding operation, which leads to high per-iteration complexity and slow convergence, especially for high-dimensional problems. To address this issue, we propose a new stochastic recursive gradient support pursuit (SRGSP) algorithm, in which only one hard thresholding operation is required in each outer iteration. Thus, SRGSP has significantly lower computational complexity than existing methods such as SG-HT and SVRGHT. We also provide a convergence analysis of SRGSP, which shows that it attains a linear convergence rate. Our experimental results on large-scale synthetic and real-world datasets verify that SRGSP outperforms state-of-the-art related methods on various sparse representation problems. Moreover, we conduct experiments on two real-world sparse representation applications, image denoising and face recognition, and the results validate that SRGSP obtains much better performance than other sparse representation learning optimization methods in terms of PSNR and recognition rate.
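The abstract's key idea, deferring the hard-thresholding (support-projection) step to once per outer iteration while inner iterations use a recursive stochastic gradient estimator, can be illustrated with a minimal sketch. This is not the authors' implementation: it specializes to sparse least squares, uses a SARAH-style recursive gradient update, and the function names, step size, and loop counts are all assumptions for illustration.

```python
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -s)[-s:]
    out[idx] = x[idx]
    return out

def srgsp_sketch(A, b, s, eta=0.02, outer=30, inner=None, seed=0):
    """Illustrative sketch of the SRGSP idea for f(x) = (1/2n)||Ax - b||^2
    with ||x||_0 <= s: recursive stochastic gradient inner loop, one
    hard-thresholding operation per outer iteration."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    if inner is None:
        inner = n
    x = np.zeros(d)
    for _ in range(outer):
        # Full gradient at the sparse snapshot starts each outer iteration.
        g = A.T @ (A @ x - b) / n
        y_prev, y = x.copy(), x - eta * g
        for _ in range(inner):
            i = rng.integers(n)
            # Recursive (SARAH-style) estimator: update g using the same
            # sample's gradient at the current and previous inner iterates.
            g = A[i] * (A[i] @ y - b[i]) - A[i] * (A[i] @ y_prev - b[i]) + g
            y_prev, y = y, y - eta * g
        # The only hard-thresholding operation of this outer iteration.
        x = hard_threshold(y, s)
    return x
```

Note the contrast with SG-HT/SVRGHT as described in the abstract: there, the projection onto the sparse set would sit inside the inner loop, once per stochastic step, whereas here the inner iterates stay dense and are projected only when the outer iteration ends.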
Journal: Sensors (Basel)
Published online: 2020-08-30
© 2020 by the authors. Licensee MDPI, Basel, Switzerland.
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).