
An incremental mirror descent subgradient algorithm with random sweeping and proximal step


Bibliographic Details

Main Authors: Boţ, Radu Ioan, Böhm, Axel
Format: Online Article Text
Language: English
Published: Taylor & Francis 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6382287/
https://www.ncbi.nlm.nih.gov/pubmed/30828224
http://dx.doi.org/10.1080/02331934.2018.1482491

Description
Summary: We investigate the convergence properties of incremental mirror descent-type subgradient algorithms for minimizing the sum of convex functions. In each step, we only evaluate the subgradient of a single component function and mirror it back to the feasible domain, which makes iterations very cheap to compute. The analysis is made for a randomized selection of the component functions, which yields the deterministic algorithm as a special case. Under supplementary differentiability assumptions on the function which induces the mirror map, we are also able to deal with the presence of another term in the objective function, which is evaluated via a proximal-type step. In both cases, we derive convergence rates of O(1/√k) in expectation for the kth best objective function value and illustrate our theoretical findings by numerical experiments in positron emission tomography and machine learning.
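To make the abstract's description concrete, the following is a minimal sketch of one incremental mirror descent subgradient scheme with random sweeping: minimizing a sum of convex components over the probability simplex with the entropic mirror map, which gives exponentiated-gradient updates. The problem data, component functions, and step sizes here are illustrative assumptions, not the paper's experiments or its exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 20                          # dimension, number of component functions
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

def subgrad(i, x):
    # Subgradient of the (assumed) component f_i(x) = |a_i^T x - b_i|.
    return A[i] * np.sign(A[i] @ x - b[i])

def objective(x):
    # Full objective: sum of all component functions.
    return np.abs(A @ x - b).sum()

x = np.full(n, 1.0 / n)               # start at the centre of the simplex
best = objective(x)
for k in range(1, 2001):
    i = rng.integers(m)               # random sweeping: one component per step
    step = 0.1 / np.sqrt(k)           # diminishing step size
    y = x * np.exp(-step * subgrad(i, x))  # entropic mirror update
    x = y / y.sum()                   # normalisation maps back to the simplex
    best = min(best, objective(x))    # track the kth best objective value
```

Each iteration touches only one component function, which is what makes the per-step cost cheap; tracking `best` mirrors the "kth best objective function value" for which the convergence rate is stated.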