
Proximal extrapolated gradient methods for variational inequalities

The paper concerns novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account local information about the operator. The methods do not require Lipschitz continuity of the operator, and the linesearch procedure uses only values of the operator.

Full description
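A minimal sketch of the idea the summary describes, not the paper's exact algorithm: a projected-gradient step for a monotone variational inequality whose step size is chosen by a backtracking linesearch that evaluates only operator values, so no global Lipschitz constant of the operator is needed. The names `F`, `project`, `projected_gradient_vi`, and all parameter defaults are illustrative assumptions.

```python
import numpy as np

def projected_gradient_vi(F, project, x0, lam=1.0, beta=0.7,
                          delta=0.9, max_iter=500, tol=1e-10):
    """Sketch: solve the VI  <F(x*), x - x*> >= 0  for all x in C,
    where `project` is the Euclidean projection onto C.

    The linesearch shrinks the step `lam` until a local
    Lipschitz-type test, checked using operator values only, holds:
        lam * ||F(z) - F(x)|| <= delta * ||z - x||.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        Fx = F(x)
        while True:
            z = project(x - lam * Fx)  # trial step
            if lam * np.linalg.norm(F(z) - Fx) <= delta * np.linalg.norm(z - x):
                break
            lam *= beta  # backtrack: shrink the step and retry
        if np.linalg.norm(z - x) <= tol:
            return z
        x = z
    return x
```

Note that when the operator is affine, F(x) = Mx + q, the test reduces to comparing lam·‖M(z − x)‖ with delta·‖z − x‖, i.e. only vector–vector operations per trial, which mirrors the remark in the abstract about affine operators.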

Bibliographic Details
Main Author: Malitsky, Yu
Format: Online Article Text
Language: English
Published: Taylor & Francis 2017
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5751890/
https://www.ncbi.nlm.nih.gov/pubmed/29348705
http://dx.doi.org/10.1080/10556788.2017.1300899
_version_ 1783290041780404224
author Malitsky, Yu
author_facet Malitsky, Yu
author_sort Malitsky, Yu
collection PubMed
description The paper concerns novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account local information about the operator. Also, the methods do not require Lipschitz continuity of the operator, and the linesearch procedure uses only values of the operator. Moreover, when the operator is affine our linesearch becomes very simple: it needs only simple vector–vector operations. For all our methods, we establish the ergodic convergence rate. In addition, we modify one of the proposed methods for the case of composite minimization. Preliminary results from numerical experiments are quite promising.
format Online
Article
Text
id pubmed-5751890
institution National Center for Biotechnology Information
language English
publishDate 2017
publisher Taylor & Francis
record_format MEDLINE/PubMed
spelling pubmed-57518902018-01-16 Proximal extrapolated gradient methods for variational inequalities Malitsky, Yu Optim Methods Softw Original Articles The paper concerns novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account local information about the operator. Also, the methods do not require Lipschitz continuity of the operator, and the linesearch procedure uses only values of the operator. Moreover, when the operator is affine our linesearch becomes very simple: it needs only simple vector–vector operations. For all our methods, we establish the ergodic convergence rate. In addition, we modify one of the proposed methods for the case of composite minimization. Preliminary results from numerical experiments are quite promising. Taylor & Francis 2017-03-21 /pmc/articles/PMC5751890/ /pubmed/29348705 http://dx.doi.org/10.1080/10556788.2017.1300899 Text en © 2017 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group http://creativecommons.org/Licenses/by/4.0/ This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/Licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Original Articles
Malitsky, Yu
Proximal extrapolated gradient methods for variational inequalities
title Proximal extrapolated gradient methods for variational inequalities
title_full Proximal extrapolated gradient methods for variational inequalities
title_fullStr Proximal extrapolated gradient methods for variational inequalities
title_full_unstemmed Proximal extrapolated gradient methods for variational inequalities
title_short Proximal extrapolated gradient methods for variational inequalities
title_sort proximal extrapolated gradient methods for variational inequalities
topic Original Articles
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5751890/
https://www.ncbi.nlm.nih.gov/pubmed/29348705
http://dx.doi.org/10.1080/10556788.2017.1300899
work_keys_str_mv AT malitskyyu proximalextrapolatedgradientmethodsforvariationalinequalities