arXiv:1901.10089
A maximum principle argument for the uniform convergence of graph Laplacian regressors

29 January 2019
Nicolas García Trillos
Ryan W. Murray
Abstract

We study asymptotic consistency guarantees for a non-parametric regression problem with Laplacian regularization. In particular, we consider samples $(x_1, y_1), \dots, (x_n, y_n)$ from some distribution on the product $\mathcal{M} \times \mathbb{R}$, where $\mathcal{M}$ is an $m$-dimensional manifold embedded in $\mathbb{R}^d$. A geometric graph on the cloud $\{x_1, \dots, x_n\}$ is constructed by connecting points that are within some specified distance $\varepsilon_n$. A suitable semi-linear equation involving the resulting graph Laplacian is used to obtain a regressor for the observed values of $y$. We establish probabilistic error rates for the uniform difference between the regressor constructed from the observed data and the Bayes regressor (or trend) associated to the ground-truth distribution. We give the explicit dependence of the rates in terms of the parameter $\varepsilon_n$, the strength of regularization $\beta_n$, and the number of data points $n$. Our argument relies on a simple, yet powerful, maximum principle for the graph Laplacian. We also address a simple extension of the framework to a semi-supervised setting.
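The construction described in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's method: it builds the $\varepsilon_n$-neighborhood graph and its unnormalized Laplacian $L = D - W$, then solves the standard Laplacian-regularized least-squares system $(I + \beta L)f = y$ as a simple linear stand-in for the semi-linear graph-Laplacian equation studied in the paper. The 0/1 edge weights, the point cloud, and the parameter values are all illustrative assumptions.

```python
import numpy as np

def graph_laplacian_regressor(x, y, eps, beta):
    """Laplacian-regularized regression on an epsilon-neighborhood graph.

    A minimal sketch: 0/1 weights and the linear system (I + beta*L) f = y
    stand in for the paper's semi-linear graph-Laplacian equation.
    """
    n = len(x)
    # Adjacency: connect points within distance eps (the paper's epsilon_n).
    dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    w = (dists < eps).astype(float)
    np.fill_diagonal(w, 0.0)
    # Unnormalized graph Laplacian L = D - W.
    lap = np.diag(w.sum(axis=1)) - w
    # First-order condition of  min_f |f - y|^2 + beta * f^T L f
    # is the linear system (I + beta*L) f = y.
    return np.linalg.solve(np.eye(n) + beta * lap, y)

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=(200, 2))          # point cloud in R^2
y = np.sin(2 * np.pi * x[:, 0]) + 0.1 * rng.normal(size=200)
f = graph_laplacian_regressor(x, y, eps=0.15, beta=0.5)
```

Here `beta` plays the role of the regularization strength $\beta_n$; larger values pull the regressor toward functions that are smooth along the graph, at the cost of fidelity to the observed labels.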
