Manifold Learning Using Kernel Density Estimation and Local Principal Components Analysis

11 September 2017
K. Mohammed
Hariharan Narayanan
Abstract

We consider the problem of recovering a $d$-dimensional manifold $\mathcal{M} \subset \mathbb{R}^n$ when provided with noiseless samples from $\mathcal{M}$. There are many algorithms (e.g., Isomap) that are used in practice to fit manifolds and thus reduce the dimensionality of a given data set. Ideally, the estimate $\mathcal{M}_\mathrm{put}$ of $\mathcal{M}$ should be an actual manifold of a certain smoothness; furthermore, $\mathcal{M}_\mathrm{put}$ should be arbitrarily close to $\mathcal{M}$ in Hausdorff distance given a large enough sample. Generally speaking, existing manifold learning algorithms do not meet these criteria. Fefferman, Mitter, and Narayanan (2016) have developed an algorithm whose output is provably a manifold. The key idea is to define an approximate squared-distance function (asdf) to $\mathcal{M}$. Then, $\mathcal{M}_\mathrm{put}$ is given by the set of points where the gradient of the asdf is orthogonal to the subspace spanned by the $n - d$ eigenvectors of the Hessian of the asdf with the largest eigenvalues. As long as the asdf meets certain regularity conditions, $\mathcal{M}_\mathrm{put}$ is a manifold that is arbitrarily close in Hausdorff distance to $\mathcal{M}$. In this paper, we define two asdfs that can be calculated from the data and show that they meet the required regularity conditions. The first asdf is based on kernel density estimation, and the second is based on estimation of tangent spaces using local principal components analysis.
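To make the second construction concrete, here is a minimal illustrative sketch of a local-PCA-based approximate squared-distance function: the asdf at a query point is taken as the squared distance to a local affine tangent-space approximation fitted by PCA on nearby samples. This is a toy version under stated assumptions, not the authors' exact estimator; the neighborhood radius `r`, the intrinsic dimension `d`, and the helper name `local_pca_asdf` are assumptions for illustration.

```python
import numpy as np

def local_pca_asdf(x, samples, r=0.3, d=1):
    """Illustrative local-PCA asdf (hypothetical helper, not the
    paper's exact construction): squared distance from x to a local
    affine approximation of the manifold near x. The radius r and
    intrinsic dimension d are assumed to be given."""
    # Collect samples within radius r of the query point x.
    nbrs = samples[np.linalg.norm(samples - x, axis=1) < r]
    mu = nbrs.mean(axis=0)
    # The top-d principal directions of the neighbors estimate
    # the tangent space of the manifold near x.
    _, _, vt = np.linalg.svd(nbrs - mu, full_matrices=False)
    tangent = vt[:d]  # (d, n): orthonormal rows spanning the tangent estimate
    # Residual of x - mu after projecting onto the tangent plane;
    # its squared norm is the approximate squared distance to M.
    diff = x - mu
    residual = diff - tangent.T @ (tangent @ diff)
    return residual @ residual

# Toy usage: noiseless samples from the unit circle in R^2 (d = 1, n = 2).
theta = np.linspace(0, 2 * np.pi, 500, endpoint=False)
samples = np.stack([np.cos(theta), np.sin(theta)], axis=1)
print(local_pca_asdf(np.array([1.05, 0.0]), samples))  # small: near M
print(local_pca_asdf(np.array([0.80, 0.0]), samples))  # larger: off M
```

In this sketch the asdf is roughly the squared Euclidean distance to the manifold for points near it, which is the qualitative behavior the regularity conditions are meant to capture; the paper's actual estimators, analysis, and choice of scale differ in detail.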
