Positive semi-definite embedding for dimensionality reduction and out-of-sample extensions

20 November 2017
Michaël Fanuel, Antoine Aspeel, Jean-Charles Delvenne, Johan A. K. Suykens
arXiv:1711.07271 (abs | PDF | HTML)
Abstract

In machine learning and statistics, it is often desirable to reduce the dimensionality of a sample of data points in a high-dimensional space $\mathbb{R}^d$. This paper introduces a dimensionality reduction method where the embedding coordinates are the eigenvectors of a positive semi-definite kernel obtained as the solution of an infinite-dimensional analogue of a semi-definite program. This embedding is adaptive and non-linear. We discuss this problem under both weak and strong smoothness assumptions on the learned kernel. A main feature of our approach is the existence of an out-of-sample extension formula for the embedding coordinates in both cases. This extrapolation formula yields an extension of the kernel matrix to a data-dependent Mercer kernel function. Our empirical results indicate that this embedding method is more robust to the influence of outliers than a spectral embedding method.
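
The out-of-sample extension highlighted in the abstract plays the same role as the classical Nyström formula does for a fixed kernel: embed new points by projecting their kernel evaluations onto the training eigenvectors. As a point of reference only, here is a minimal sketch of that generic pattern in Python, using a fixed Gaussian kernel rather than the paper's learned SDP kernel; the kernel choice, `gamma`, and all names are illustrative assumptions, not the authors' method.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Gram matrix of a Gaussian (RBF) kernel between rows of X and rows of Y.
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def spectral_embedding(K, k):
    # Embedding coordinates from the top-k eigenpairs of a PSD kernel matrix.
    vals, vecs = np.linalg.eigh(K)                      # ascending eigenvalues
    vals, vecs = vals[::-1][:k], vecs[:, ::-1][:, :k]   # keep the k largest
    return vecs * np.sqrt(vals), vecs, vals

def out_of_sample(K_new, vecs, vals):
    # Nystrom-style extension: the j-th coordinate of a new point x is
    # (1 / sqrt(lambda_j)) * sum_i k(x, x_i) * v_ij, which reproduces the
    # in-sample coordinates sqrt(lambda_j) * v_ij when x is a training point.
    return K_new @ vecs / np.sqrt(vals)[None, :]

# Usage: embed 200 training points, then extend to 20 unseen points.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 10))
X_new = rng.normal(size=(20, 10))

Z_train, vecs, vals = spectral_embedding(rbf_kernel(X_train, X_train), k=2)
Z_new = out_of_sample(rbf_kernel(X_new, X_train), vecs, vals)
```

The difference claimed by the paper is that the kernel itself is optimized, as the solution of an infinite-dimensional analogue of a semi-definite program, before this eigendecomposition step, and that the extension formula then applies to the resulting data-dependent Mercer kernel rather than to a fixed kernel like the one above.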
