
Reconstructing Kernel-Based Machine Learning Force Fields with Superlinear Convergence

Kernel machines have sustained continuous progress in the field of quantum chemistry. In particular, they have proven successful in the low-data regime of force field reconstruction, because many equivariances and invariances due to physical symmetries can be incorporated into the kernel function to compensate for much larger data sets. So far, however, the scalability of kernel machines has been hindered by their quadratic memory and cubic runtime complexity in the number of training points. While iterative Krylov subspace solvers are known to overcome these burdens, their convergence crucially relies on effective preconditioners, which are elusive in practice. Effective preconditioners need to partially presolve the learning problem in a computationally cheap and numerically robust manner. Here, we consider the broad class of Nyström-type methods to construct preconditioners based on successively more sophisticated low-rank approximations of the original kernel matrix, each of which provides a different set of computational trade-offs. All considered methods aim to identify a representative subset of inducing (kernel) columns to approximate the dominant kernel spectrum.
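To make the preconditioning idea summarized in the abstract concrete, here is a minimal Python sketch of the general Nyström-preconditioned conjugate-gradient approach. It is an illustration, not the authors' method or code: the RBF kernel, the uniform sampling of inducing columns, the regularization value, and all function and variable names are assumptions. It builds a rank-m Nyström approximation K ≈ C W^{-1} C^T from m inducing columns and applies the inverse of the preconditioner P = C W^{-1} C^T + lam*I inside SciPy's CG solver via the Woodbury identity.

# Hedged sketch: Nystroem-preconditioned conjugate gradients for the
# regularized kernel system (K + lam*I) alpha = y. Illustrative only;
# kernel choice, column sampling, and all names are assumptions.
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.sparse.linalg import LinearOperator, cg
from scipy.spatial.distance import cdist

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian kernel matrix between the rows of X and Z.
    return np.exp(-cdist(X, Z, "sqeuclidean") / (2.0 * sigma ** 2))

def nystroem_preconditioner(X, idx, lam, sigma=1.0):
    # Nystroem approximation K ~ C W^{-1} C^T from inducing columns idx.
    # The preconditioner P = C W^{-1} C^T + lam*I is inverted with the
    # Woodbury identity: P^{-1} v = (v - C (lam*W + C^T C)^{-1} C^T v) / lam.
    n, m = X.shape[0], len(idx)
    C = rbf_kernel(X, X[idx], sigma)           # n x m inducing columns
    W = rbf_kernel(X[idx], X[idx], sigma)      # m x m core block
    inner = lam * W + C.T @ C                  # m x m, symmetric positive definite
    chol = cho_factor(inner + 1e-10 * np.eye(m))  # small jitter for robustness
    def matvec(v):
        return (v - C @ cho_solve(chol, C.T @ v)) / lam
    return LinearOperator((n, n), matvec=matvec)

# Usage on toy data: m = 50 inducing columns out of n = 500 points.
rng = np.random.default_rng(0)
X, y = rng.standard_normal((500, 3)), rng.standard_normal(500)
lam = 1e-3
idx = rng.choice(500, size=50, replace=False)  # uniform column sampling
K = rbf_kernel(X, X)
alpha, info = cg(K + lam * np.eye(500), y,
                 M=nystroem_preconditioner(X, idx, lam))

The only per-iteration overhead of the preconditioner is two n-by-m matrix-vector products and one m-by-m triangular solve, which is the computational trade-off the abstract alludes to: a cheap partial presolve of the learning problem that captures the dominant kernel spectrum.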


Bibliographic Details
Main Authors: Blücher, Stefan; Müller, Klaus-Robert; Chmiela, Stefan
Format: Online Article (Text)
Language: English
Journal: J Chem Theory Comput
Published: American Chemical Society, 2023-05-08
Collection: PubMed
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10373489/
https://www.ncbi.nlm.nih.gov/pubmed/37156733
http://dx.doi.org/10.1021/acs.jctc.2c01304
License: © 2023 The Authors. Published by American Chemical Society. Distributed under CC BY-NC-ND 4.0 (https://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial access and re-use provided that author attribution and integrity are maintained, but does not permit the creation of adaptations or other derivative works.