Nonlinear conjugate gradient methods for unconstrained optimization

Two approaches are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton (truncated Newton) method and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications.

Bibliographic Details
Main Author: Andrei, Neculai
Language: eng
Published: Springer 2020
Subjects: Mathematical Physics and Mathematics
Online Access: https://dx.doi.org/10.1007/978-3-030-42950-8
http://cds.cern.ch/record/2722854
author Andrei, Neculai
collection CERN
description Two approaches are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton (truncated Newton) method and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include: linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid methods, modifications of the standard scheme, memoryless BFGS preconditioned methods, and three-term methods. Other conjugate gradient methods, based on clustering the eigenvalues or minimizing the condition number of the iteration matrix, are also treated. For each method, the convergence analysis, the computational performance, and comparisons with other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms is developed with a clear, rigorous, and friendly exposition; readers will gain an understanding of their properties and convergence, and will learn to develop and prove the convergence of their own methods. Numerous numerical studies are supplied, with comparisons and comments on the behavior of conjugate gradient algorithms for solving a collection of 800 unconstrained optimization problems of different structures and complexities, with the number of variables in the range [1000, 10000]. The book is addressed to all those interested in developing and using new advanced techniques for solving complex unconstrained optimization problems. Mathematical programming researchers, theoreticians and practitioners in operations research, practitioners in engineering, industry researchers, as well as graduate students in mathematics and Ph.D. and master's students in mathematical programming, will find plenty of information and practical applications for solving large-scale unconstrained optimization problems by conjugate gradient methods.
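The description lists several families of nonlinear conjugate gradient methods. As a minimal illustrative sketch (not taken from the book, and using the classical Fletcher-Reeves update rather than any of the book's own variants), one iteration of such a method combines a conjugate search direction with a line search:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=200, tol=1e-8):
    """Nonlinear conjugate gradient with the Fletcher-Reeves update
    and a simple backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:
            d = -g  # safeguard: restart if d is not a descent direction
        # Backtracking line search: shrink alpha until the Armijo condition holds
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves parameter: beta_k = ||g_{k+1}||^2 / ||g_k||^2
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```

Other choices of the conjugacy parameter beta (Polak-Ribiere, Hestenes-Stiefel, Dai-Yuan, and the hybrid and three-term schemes the record mentions) slot into the same loop; only the `beta` update changes.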
id cern-2722854
institution European Organization for Nuclear Research (CERN)
language eng
publishDate 2020
publisher Springer
record_format invenio
title Nonlinear conjugate gradient methods for unconstrained optimization
topic Mathematical Physics and Mathematics
url https://dx.doi.org/10.1007/978-3-030-42950-8
http://cds.cern.ch/record/2722854