
Regularized gradient-projection methods for finding the minimum-norm solution of the constrained convex minimization problem


Bibliographic Details
Main Authors: Tian, Ming; Zhang, Hui-Fang
Format: Online Article Text
Language: English
Published: Springer International Publishing 2017
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5222927/
https://www.ncbi.nlm.nih.gov/pubmed/28111511
http://dx.doi.org/10.1186/s13660-016-1289-4
Description
Summary: Let H be a real Hilbert space and C be a nonempty closed convex subset of H. Assume that g is a real-valued convex function and the gradient ∇g is 1/L-ism with L > 0. Let 0 < λ < 2/(L + 2) and {α_n} ⊂ (0, 1). We prove that the sequence {x_n} generated by the iterative algorithm x_{n+1} = P_C(I − λ(∇g + α_n I))x_n, n ≥ 0, converges strongly to q, where q = P_S(0) is the minimum-norm solution of the constrained convex minimization problem (S denotes its solution set) and also solves the variational inequality ⟨−q, p − q⟩ ≤ 0, ∀p ∈ S. Under suitable conditions, we obtain some strong convergence theorems. As an application, we apply our algorithm to the split feasibility problem in Hilbert spaces.
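The iteration described in the summary can be sketched numerically. The following is a minimal, illustrative Python sketch of the regularized gradient-projection step x_{n+1} = P_C(x_n − λ(∇g(x_n) + α_n x_n)); the quadratic objective, the box constraint C = [0, 1]^d, the choice α_n = 1/(n + 1), and the names grad_g, project_C, and lam are assumptions made for this example, not details taken from the paper.

```python
import numpy as np

# Illustrative setup (not from the paper): minimize g(x) = 0.5 * ||A x - b||^2
# over the box C = [0, 1]^d.  Here grad g(x) = A^T (A x - b) is 1/L-ism with
# L equal to the spectral norm of A^T A.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad_g(x):
    # Gradient of the quadratic objective g.
    return A.T @ (A @ x - b)

def project_C(x):
    # Metric projection P_C onto the box [0, 1]^d.
    return np.clip(x, 0.0, 1.0)

L = np.linalg.norm(A.T @ A, 2)   # Lipschitz constant of grad g
lam = 1.9 / (L + 2.0)            # fixed step size with 0 < lam < 2/(L + 2)

x = np.zeros(A.shape[1])         # starting point x_0 in C
for n in range(1, 5001):
    alpha_n = 1.0 / (n + 1)      # regularization: alpha_n -> 0, sum alpha_n = inf
    # Regularized gradient-projection step:
    # x_{n+1} = P_C( x_n - lam * (grad g(x_n) + alpha_n * x_n) )
    x = project_C(x - lam * (grad_g(x) + alpha_n * x))

print("approximate minimum-norm minimizer:", x)
```

As the regularization α_n vanishes, the iterates in this sketch settle on one particular minimizer of g over C (the one of smallest norm), which is the behavior the theorem above establishes in the general Hilbert-space setting.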