Adaptive Geometric Multiscale Approximations for Intrinsically Low-dimensional Data

Wenjing Liao, Mauro Maggioni

arXiv:1611.01179, 3 November 2016

Abstract

We consider the problem of efficiently approximating and encoding high-dimensional data sampled from a probability distribution $\rho$ in $\mathbb{R}^D$ that is nearly supported on a $d$-dimensional set $\mathcal{M}$, for example a $d$-dimensional Riemannian manifold. Geometric Multi-Resolution Analysis (GMRA) provides a robust and computationally efficient procedure for constructing low-dimensional geometric approximations of $\mathcal{M}$ at varying resolutions. We introduce a thresholding algorithm on the geometric wavelet coefficients, leading to what we call adaptive GMRA approximations. We show that these data-driven, empirical approximations perform well on a wide variety of measures $\rho$ when the threshold is chosen as a suitable universal function of the number of samples $n$. These measures are allowed to exhibit different regularity at different scales and locations, so adaptive GMRA efficiently encodes data from measures more complex than those supported on manifolds. The approximations yield a data-driven dictionary, together with a fast transform mapping data to coefficients and an inverse of that map. The algorithms for both the dictionary construction and the transforms have complexity $C n \log n$, with the constant linear in $D$ and exponential in $d$. Our work therefore establishes adaptive GMRA as a fast dictionary learning algorithm with approximation guarantees. We include several numerical experiments on both synthetic and real data, confirming our theoretical results and demonstrating the effectiveness of adaptive GMRA.
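
The thresholding step described above can be illustrated with a small numerical sketch. The code below is a hypothetical simplification, not the authors' implementation: the multiscale partition is replaced by a naive recursive PCA split, each cell is approximated by a rank-$d$ local PCA plane, and the size of the refinement between a parent cell and its child serves as a stand-in for the geometric wavelet coefficient norm. A refinement is kept only when it exceeds a threshold scaling like $\sqrt{\log n / n}$, echoing the universal, sample-size-dependent threshold mentioned in the abstract. All function names and tuning constants are illustrative assumptions.

import numpy as np

def local_affine_approx(X, d):
    """Rank-d local PCA (affine) approximation of the rows of X."""
    center = X.mean(axis=0)
    Y = X - center
    _, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return center + Y @ Vt[:d].T @ Vt[:d]

def adaptive_gmra_approx(X, d, tau, min_points=20, depth=0, max_depth=12):
    """Refine a cell only while the refinement (a proxy for the geometric
    wavelet coefficient norm) exceeds the threshold tau."""
    parent = local_affine_approx(X, d)
    if len(X) < min_points or depth >= max_depth:
        return parent
    # Naive split of the cell along its top principal direction
    # (stands in for the multiscale partition used by GMRA).
    center = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - center, full_matrices=False)
    left = (X - center) @ Vt[0] < 0
    out = parent.copy()
    for mask in (left, ~left):
        if mask.sum() < d + 1:
            continue
        child = adaptive_gmra_approx(X[mask], d, tau, min_points, depth + 1, max_depth)
        # Size of the refinement on this child cell: keep it only if it
        # exceeds the threshold, otherwise fall back to the parent plane.
        refinement = np.sqrt(np.mean(np.sum((child - parent[mask]) ** 2, axis=1)))
        if refinement > tau:
            out[mask] = child
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Noisy circle (intrinsic dimension d = 1) embedded in R^10.
    t = rng.uniform(0.0, 2.0 * np.pi, 2000)
    X = np.zeros((2000, 10))
    X[:, 0], X[:, 1] = np.cos(t), np.sin(t)
    X += 0.01 * rng.standard_normal(X.shape)
    n = len(X)
    tau = np.sqrt(np.log(n) / n)  # threshold as a universal function of n (assumed scaling)
    Xhat = adaptive_gmra_approx(X, d=1, tau=tau)
    print("mean squared approximation error:", np.mean(np.sum((X - Xhat) ** 2, axis=1)))

On this toy example the partition refines only where the thresholded refinements warrant it and keeps the coarse local plane elsewhere; the actual GMRA construction, its geometric wavelets, and the $C n \log n$ algorithms are developed in the paper itself.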
