Rates of Convergence for Regression with the Graph Poly-Laplacian

Abstract

In the (special) smoothing spline problem one considers a variational problem with a quadratic data fidelity penalty and Laplacian regularisation. Higher-order regularity can be obtained by replacing the Laplacian regulariser with a poly-Laplacian regulariser. The methodology is readily adapted to graphs, and here we consider graph poly-Laplacian regularisation in a fully supervised, non-parametric, noise-corrupted regression problem. In particular, given a dataset $\{x_i\}_{i=1}^n$ and a set of noisy labels $\{y_i\}_{i=1}^n \subset \mathbb{R}$, we let $u_n : \{x_i\}_{i=1}^n \to \mathbb{R}$ be the minimiser of an energy consisting of a data fidelity term and an appropriately scaled graph poly-Laplacian term. When $y_i = g(x_i) + \xi_i$, for iid noise $\xi_i$, and using the geometric random graph, we identify (with high probability) the rate of convergence of $u_n$ to $g$ in the large data limit $n \to \infty$. Furthermore, our rate, up to logarithms, coincides with the known rate of convergence in the usual smoothing spline model.
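The abstract only states the variational problem, but the setup it describes can be sketched concretely: build a geometric random graph on the sample, form the graph Laplacian $L$ and its power $L^s$, and solve the first-order condition of the regularised least-squares energy. The following is a minimal illustrative sketch, not the paper's implementation; the connectivity radius `eps`, the order `s`, the weight `tau`, and the test function `g` are assumed choices for demonstration only.

```python
import numpy as np
from scipy.spatial.distance import cdist

# Sample a dataset {x_i} and noisy labels y_i = g(x_i) + xi_i.
rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0.0, 1.0, size=(n, 1))           # dataset {x_i}
g = lambda t: np.sin(2 * np.pi * t)              # regression function (assumed)
y = g(x[:, 0]) + 0.1 * rng.standard_normal(n)    # noisy labels {y_i}

# Geometric random graph: connect points within distance eps (assumed radius).
eps = 0.1
W = (cdist(x, x) < eps).astype(float)
np.fill_diagonal(W, 0.0)

# Unnormalised graph Laplacian L = D - W, and the poly-Laplacian L^s.
L = np.diag(W.sum(axis=1)) - W
s = 2                                            # s = 2 gives a graph bi-Laplacian
Ls = np.linalg.matrix_power(L, s)

# Minimise (1/n) ||u - y||^2 + tau <u, L^s u>.
# The first-order condition is (I + n * tau * L^s) u = y.
tau = 1e-6                                       # regularisation weight (assumed)
u_n = np.linalg.solve(np.eye(n) + n * tau * Ls, y)
```

Setting the gradient of the energy to zero gives $(I + n\tau L^s) u = y$, so the minimiser is obtained by a single linear solve; the paper's analysis concerns how such a $u_n$ converges to $g$ as $n \to \infty$ under appropriate scaling of the regularisation.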
