ResearchTrend.AI
arXiv:2106.01529
Minimax Optimal Regression over Sobolev Spaces via Laplacian Regularization on Neighborhood Graphs

3 June 2021
Alden Green
Sivaraman Balakrishnan
R. Tibshirani
Abstract

In this paper we study the statistical properties of Laplacian smoothing, a graph-based approach to nonparametric regression. Under standard regularity conditions, we establish upper bounds on the error of the Laplacian smoothing estimator $\widehat{f}$, and a goodness-of-fit test also based on $\widehat{f}$. These upper bounds match the minimax optimal estimation and testing rates of convergence over the first-order Sobolev class $H^1(\mathcal{X})$, for $\mathcal{X} \subseteq \mathbb{R}^d$ and $1 \leq d < 4$; in the estimation problem, for $d = 4$, they are optimal modulo a $\log n$ factor. Additionally, we prove that Laplacian smoothing is manifold-adaptive: if $\mathcal{X} \subseteq \mathbb{R}^d$ is an $m$-dimensional manifold with $m < d$, then the error rate of Laplacian smoothing (in either estimation or testing) depends only on $m$, in the same way it would if $\mathcal{X}$ were a full-dimensional set in $\mathbb{R}^d$.
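For readers unfamiliar with the estimator, a minimal sketch of Laplacian smoothing follows: $\widehat{f}$ solves $\min_f \|y - f\|_2^2 + \lambda f^\top L f$ for a graph Laplacian $L$, which has the closed form $\widehat{f} = (I + \lambda L)^{-1} y$. The epsilon-neighborhood graph with 0/1 weights and the particular `radius` and `lam` values below are illustrative assumptions, not the paper's exact construction or tuning.

```python
import numpy as np

def laplacian_smoothing(X, y, radius=0.1, lam=1.0):
    """Illustrative Laplacian smoothing on an epsilon-neighborhood graph.

    Connects sample points within `radius` (0/1 edge weights), forms the
    unnormalized graph Laplacian L = D - W, and returns the minimizer of
    ||y - f||^2 + lam * f' L f, i.e. f_hat = (I + lam * L)^{-1} y.
    """
    n = len(X)
    # Pairwise Euclidean distances between the n sample points.
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    W = ((dists <= radius) & (dists > 0)).astype(float)  # adjacency, no self-loops
    L = np.diag(W.sum(axis=1)) - W                       # unnormalized Laplacian
    # Closed-form solution of the penalized least-squares problem.
    return np.linalg.solve(np.eye(n) + lam * L, y)

# Usage: denoise samples of a smooth function on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.3 * rng.standard_normal(200)
f_hat = laplacian_smoothing(X, y, radius=0.1, lam=2.0)
```

Because the solve shrinks each graph-Fourier coefficient of $y$ by a factor $1/(1 + \lambda \mu_i)$, the fitted vector is always at least as smooth as the raw responses in the penalty norm $f^\top L f$.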
