
arXiv:1606.07104 (v7, latest)

Manifolds' Projective Approximation Using The Moving Least-Squares (MMLS)

22 June 2016
B. Sober
D. Levin
Abstract

In order to avoid the curse of dimensionality, frequently encountered in Big Data analysis, there has been extensive development of linear and non-linear dimension reduction techniques in recent years. These techniques (sometimes referred to as manifold learning) assume that the scattered input data lie on a lower dimensional manifold; thus, the high dimensionality problem can be overcome by learning the lower dimensional behavior. However, in real-life applications, data is often very noisy. In this work, we propose a method to approximate a $d$-dimensional $C^{m+1}$ smooth submanifold $\mathcal{M}$ residing in $\mathbb{R}^n$ ($d \ll n$) based upon scattered data points (i.e., a data cloud). We assume that the data points are located "near" the lower dimensional manifold and perform a non-linear moving least-squares projection onto an approximating manifold. Under some mild assumptions, the resulting approximant is shown to be infinitely smooth and of approximation order $O(h^{m+1})$. Furthermore, the method presented here assumes no analytic knowledge of the approximated manifold, and the approximation algorithm is linear in the large dimension $n$.
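The moving least-squares projection described in the abstract can be illustrated in simplified form. The sketch below is not the paper's exact two-step MMLS procedure; the function name `mmls_project`, the Gaussian weight with bandwidth `h`, and the fixed degree-2 local polynomial are choices made here for brevity. The idea it demonstrates matches the abstract's outline: a weighted local PCA around the query point yields an approximating $d$-plane (local coordinate system), and a weighted least-squares polynomial fit over those local coordinates supplies the projected point.

```python
import numpy as np

def mmls_project(q, points, h, d):
    """Project a query point q toward an approximating d-dimensional
    manifold, in the spirit of a moving least-squares projection
    (simplified sketch with a degree-2 local polynomial)."""
    points = np.asarray(points, dtype=float)
    q = np.asarray(q, dtype=float)

    # Gaussian weights centred at q; h plays the role of a local scale.
    w = np.exp(-np.sum((points - q) ** 2, axis=1) / h**2)

    # Step 1: weighted local PCA gives an approximating d-plane at q.
    mu = (w[:, None] * points).sum(axis=0) / w.sum()
    X = points - mu
    C = (w[:, None] * X).T @ X          # weighted covariance (n x n)
    _, vecs = np.linalg.eigh(C)         # eigenvalues in ascending order
    B = vecs[:, -d:]                    # top-d directions span the plane

    # Local (tangential) coordinates and normal residuals of the data.
    t = X @ B                           # (N, d) coordinates on the plane
    r = X - t @ B.T                     # (N, n) residuals off the plane

    # Degree-2 polynomial features of the local coordinates.
    feats = [np.ones(len(t))]
    feats += [t[:, i] for i in range(d)]
    feats += [t[:, i] * t[:, j] for i in range(d) for j in range(i, d)]
    A = np.stack(feats, axis=1)

    # Step 2: weighted least-squares fit of the residuals, r ~ A @ coef.
    W = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(W * A, W * r, rcond=None)

    # Evaluate the fitted local polynomial at q's own local coordinate.
    tq = (q - mu) @ B
    fq = np.array([1.0] + [tq[i] for i in range(d)]
                  + [tq[i] * tq[j] for i in range(d) for j in range(i, d)])
    return mu + tq @ B.T + fq @ coef
```

Note that each projection touches the ambient coordinates only through inner products and a least-squares solve over the $d$-dimensional local features, which is what makes the per-point cost linear in the large dimension $n$, as the abstract states.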
