
Rates of convergence for regression with the graph poly-Laplacian


Bibliographic Details
Main Authors: Trillos, Nicolás García, Murray, Ryan, Thorpe, Matthew
Format: Online Article Text
Language: English
Published: Springer International Publishing 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10682086/
https://www.ncbi.nlm.nih.gov/pubmed/38037599
http://dx.doi.org/10.1007/s43670-023-00075-5
Description
Summary: In the (special) smoothing spline problem one considers a variational problem with a quadratic data fidelity penalty and Laplacian regularization. Higher-order regularity can be obtained by replacing the Laplacian regularizer with a poly-Laplacian regularizer. The methodology is readily adapted to graphs, and here we consider graph poly-Laplacian regularization in a fully supervised, non-parametric, noise-corrupted regression problem. In particular, given a dataset [Formula: see text] and a set of noisy labels [Formula: see text], we let [Formula: see text] be the minimizer of an energy which consists of a data fidelity term and an appropriately scaled graph poly-Laplacian term. When [Formula: see text], for iid noise [Formula: see text], and using the geometric random graph, we identify (with high probability) the rate of convergence of [Formula: see text] to g in the large data limit [Formula: see text]. Furthermore, our rate is close to the known rate of convergence in the usual smoothing spline model.
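The abstract's formulas are not reproduced in this record, but the estimator it describes can be sketched numerically: sample points, build a geometric random graph, form the graph Laplacian L, and minimize a quadratic data-fidelity term plus a scaled poly-Laplacian penalty, which for a quadratic energy reduces to a linear solve. The regression function g, noise level, connectivity radius eps, and the parameters m and tau below are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample n points in [0, 1] and noisy labels y_i = g(x_i) + noise
# (g and the noise level are illustrative choices, not from the paper).
n = 200
x = rng.uniform(0.0, 1.0, n)
g = np.sin(2 * np.pi * x)                 # hypothetical regression function
y = g + 0.1 * rng.standard_normal(n)      # iid Gaussian noise (assumed)

# Geometric random graph: connect points within distance eps.
eps = 0.05
W = (np.abs(x[:, None] - x[None, :]) < eps).astype(float)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W            # unnormalized graph Laplacian

# Graph poly-Laplacian regularized estimate: minimize
#   ||u - y||^2 + tau * u^T L^m u,
# whose unique minimizer solves (I + tau * L^m) u = y.
m, tau = 2, 1e-4
Lm = np.linalg.matrix_power(L, m)
u = np.linalg.solve(np.eye(n) + tau * Lm, y)

rmse = np.sqrt(np.mean((u - g) ** 2))
print(rmse)
```

Because the energy is strictly convex, the linear solve gives the exact minimizer; in particular its energy never exceeds that of the raw labels y. This is only a one-dimensional toy illustration of the regularization scheme, not the paper's setting or proof apparatus.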