A maximum principle argument for the uniform convergence of graph Laplacian regressors

Abstract

We study asymptotic consistency guarantees for a non-parametric regression problem with Laplacian regularization. In particular, we consider samples $(x_1, y_1), \dots, (x_n, y_n)$ from a distribution on the product space $\mathcal{M} \times \mathbb{R}$, where $\mathcal{M}$ is an $m$-dimensional manifold embedded in $\mathbb{R}^d$. A geometric graph on the point cloud $\{x_1, \dots, x_n\}$ is constructed by connecting points that are within a specified distance $\varepsilon_n$. A suitable semi-linear equation involving the resulting graph Laplacian is used to obtain a regressor for the observed values of $y$. We establish probabilistic error rates for the uniform difference between the regressor constructed from the observed data and the Bayes regressor (or trend) associated with the ground-truth distribution. We give the explicit dependence of the rates on the connectivity parameter $\varepsilon_n$, the regularization strength $\beta_n$, and the number of data points $n$. Our argument relies on a simple, yet powerful, maximum principle for the graph Laplacian. We also address a simple extension of the framework to a semi-supervised setting.
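To make the setup concrete, the following is a minimal sketch of the pipeline the abstract describes: build an $\varepsilon$-neighborhood graph on the point cloud, form its unnormalized graph Laplacian $L = D - W$, and solve a regularized equation for the regressor. The specific equation $(I + \beta L)u = y$ used below is an assumption for illustration (it is the first-order condition of $\min_u \|u - y\|^2 + \beta\, u^\top L u$); the paper's actual semi-linear equation may differ.

```python
# Hedged sketch (not the paper's exact method): epsilon-graph Laplacian
# regression. We take the linear special case (I + beta * L) u = y, the
# first-order condition of min_u ||u - y||^2 + beta * u^T L u.
import numpy as np

def epsilon_graph_laplacian(X, eps):
    """Unnormalized graph Laplacian L = D - W of the eps-neighborhood graph."""
    # Pairwise Euclidean distances between the n points in X (shape n x d).
    diffs = X[:, None, :] - X[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)
    W = (dists <= eps).astype(float)   # connect points within distance eps
    np.fill_diagonal(W, 0.0)           # no self-loops
    D = np.diag(W.sum(axis=1))         # degree matrix
    return D - W

def laplacian_regressor(X, y, eps, beta):
    """Solve (I + beta * L) u = y for the Laplacian-regularized regressor u."""
    L = epsilon_graph_laplacian(X, eps)
    n = len(y)
    # I + beta * L is symmetric positive definite for beta >= 0, so the
    # linear system has a unique solution.
    return np.linalg.solve(np.eye(n) + beta * L, y)

# Toy example: noisy samples of a smooth trend on the unit square (m = d = 2).
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 2))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
u = laplacian_regressor(X, y, eps=0.15, beta=0.05)
```

With $\beta = 0$ the regressor reduces to the raw observations $u = y$; increasing $\beta$ smooths $u$ along the graph, trading data fidelity for regularity, which is the trade-off the rates in the abstract quantify through $\varepsilon_n$, $\beta_n$, and $n$.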
