Variable Smoothing for Convex Optimization Problems Using Stochastic Gradients

Bibliographic Details
Main Authors: Boţ, Radu Ioan, Böhm, Axel
Format: Online Article Text
Language: English
Published: Springer US 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7581594/
https://www.ncbi.nlm.nih.gov/pubmed/33122873
http://dx.doi.org/10.1007/s10915-020-01332-8
Description
Summary: We aim to solve a structured convex optimization problem in which a nonsmooth function is composed with a linear operator. When opting for full splitting schemes, primal–dual type methods are usually employed, as they are effective and well studied. However, under the additional assumption that the nonsmooth function that is composed with the linear operator is Lipschitz continuous, we can derive novel algorithms through regularization via the Moreau envelope. Furthermore, we tackle large-scale problems by means of stochastic oracle calls, very much in the spirit of stochastic gradient techniques. Applications to total variation denoising and deblurring, and to matrix factorization, are provided.
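The approach described in the summary rests on a standard fact: the Moreau envelope g_mu(y) = min_z { g(z) + ||z - y||^2 / (2 mu) } of a convex function g is differentiable with gradient (y - prox_{mu g}(y)) / mu, so replacing g by g_mu turns f(x) + g(Kx) into a smooth problem whose gradient can be estimated stochastically. The following NumPy sketch illustrates this idea for the special case g = ||.||_1, whose proximal map is soft-thresholding; the oracle stoch_grad_f, the smoothing schedule mu_k and the step sizes are illustrative placeholders and not the schedules analysed in the paper.

import numpy as np

def prox_l1(y, tau):
    # Proximal map of tau * ||.||_1, i.e. componentwise soft-thresholding.
    return np.sign(y) * np.maximum(np.abs(y) - tau, 0.0)

def grad_moreau_l1(y, mu):
    # Gradient of the Moreau envelope of ||.||_1 with parameter mu:
    # (y - prox_{mu ||.||_1}(y)) / mu, which is 1/mu-Lipschitz.
    return (y - prox_l1(y, mu)) / mu

def variable_smoothing_sgd(stoch_grad_f, K, x0, steps=1000, seed=0):
    # Illustrative loop: minimise f(x) + ||K x||_1 by replacing the l1 term
    # with its Moreau envelope, shrinking the smoothing parameter mu_k over
    # the iterations, and using a stochastic estimate of grad f.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for k in range(1, steps + 1):
        mu_k = 1.0 / np.sqrt(k)     # placeholder smoothing schedule
        step = 1.0 / (k + 1.0)      # placeholder step size
        g = stoch_grad_f(x, rng) + K.T @ grad_moreau_l1(K @ x, mu_k)
        x = x - step * g
    return x

In a total variation denoising setting, for instance, K would be a discrete gradient operator and stoch_grad_f a minibatch gradient of the data-fidelity term; both are assumed to be supplied by the user in this sketch.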