Error Analysis on Graph Laplacian Regularized Estimator

11 February 2019
Kaige Yang
Xiaowen Dong
Laura Toni
arXiv:1902.03720
Abstract

We provide a theoretical analysis of the representation learning problem aimed at learning the latent variables (design matrix) Θ of observations Y with knowledge of the coefficient matrix X. The design matrix is learned under the assumption that the latent variables Θ are smooth with respect to a (known) topological structure 𝒢. To learn such latent variables, we study a graph Laplacian regularized estimator, i.e., the penalized least squares estimator whose penalty term is proportional to a Laplacian quadratic form. This type of estimator has recently received considerable attention due to its ability to incorporate the underlying topological graph structure of the variables into the learning process. While the estimation problem can be solved efficiently by state-of-the-art optimization techniques, its statistical consistency properties have been largely overlooked. In this work, we develop a non-asymptotic bound on the estimation error in the classical statistical setting, where the sample size is larger than the ambient dimension of the latent variables. The bound illustrates theoretically how the estimation accuracy is affected by the alignment between the data and the graph structure, as well as by the graph spectrum. It also provides theoretical evidence of the advantage, in terms of convergence rate, of the graph Laplacian regularized estimator over classical estimators that ignore the graph structure, under a smoothness prior. Finally, we provide empirical results on the estimation error to corroborate the theoretical analysis.
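The estimator studied in the abstract is penalized least squares with a Laplacian quadratic penalty, i.e. it minimizes ‖Y − XΘ‖_F² + γ·tr(Θ L Θᵀ). Setting the gradient to zero yields the Sylvester equation XᵀXΘ + γΘL = XᵀY, which has a closed-form solution. The sketch below is illustrative only: the problem sizes, the penalty weight γ, the path-graph topology, and the noise level are assumptions for demonstration, not the paper's exact setup.

```python
import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(0)

# Illustrative sizes (assumed): n samples, k latent factors, p graph nodes.
n, k, p = 200, 5, 30

# Laplacian of a path graph on p nodes -- a simple known topology standing in for G.
A = np.diag(np.ones(p - 1), 1) + np.diag(np.ones(p - 1), -1)
L = np.diag(A.sum(axis=1)) - A

# A ground-truth Theta that is smooth on the graph: built from the
# low-frequency Laplacian eigenvectors only.
eigvals, eigvecs = np.linalg.eigh(L)
Theta_true = rng.normal(size=(k, 5)) @ eigvecs[:, :5].T          # k x p

X = rng.normal(size=(n, k))                                      # coefficient matrix
Y = X @ Theta_true + 0.5 * rng.normal(size=(n, p))               # noisy observations

def laplacian_regularized_ls(X, Y, L, gamma):
    """Minimize ||Y - X Theta||_F^2 + gamma * tr(Theta L Theta^T).

    The first-order optimality condition is the Sylvester equation
        X^T X Theta + gamma * Theta L = X^T Y,
    which scipy solves directly via Bartels-Stewart.
    """
    return solve_sylvester(X.T @ X, gamma * L, X.T @ Y)

gamma = 10.0
Theta_reg = laplacian_regularized_ls(X, Y, L, gamma)
Theta_ls = laplacian_regularized_ls(X, Y, L, 0.0)  # plain least squares, no graph

print("regularized error:", np.linalg.norm(Theta_reg - Theta_true))
print("least-squares error:", np.linalg.norm(Theta_ls - Theta_true))
```

When the ground truth is genuinely smooth on the graph and the noise is moderate, the regularized estimate typically attains lower error than plain least squares, in line with the convergence-rate advantage the paper establishes.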
