ResearchTrend.AI

arXiv:2405.01702
Optimization without Retraction on the Random Generalized Stiefel Manifold

2 May 2024
Simon Vary
Pierre Ablin
Bin Gao
P.-A. Absil
Abstract

Optimization over the set of matrices X that satisfy X⊤BX = I_p, referred to as the generalized Stiefel manifold, appears in many applications involving sampled covariance matrices, such as canonical correlation analysis (CCA), independent component analysis (ICA), and the generalized eigenvalue problem (GEVP). Solving these problems is typically done by iterative methods that require a fully formed B. We propose a cheap stochastic iterative method that solves the optimization problem while having access only to random estimates of B. Our method does not enforce the constraint in every iteration; instead, it produces iterates that converge to critical points on the generalized Stiefel manifold defined in expectation. The method has lower per-iteration cost, requires only matrix multiplications, and has the same convergence rates as its Riemannian optimization counterparts that require the full matrix B. Experiments demonstrate its effectiveness in various machine learning applications involving generalized orthogonality constraints, including CCA, ICA, and the GEVP.
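As a rough illustration of the constraint-in-expectation idea, the NumPy sketch below drives X toward X⊤BX = I_p using only matrix multiplications with random minibatch estimates B_k of B, with no eigendecomposition or retraction. This is a toy constraint-attraction step, not the paper's full algorithm (which also includes an objective-descent term); the data setup, `sample_B`, and the step size `eta` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 3

# Toy dataset; B is the sampled covariance we never need to form per step.
A = rng.standard_normal((2000, n))
B = A.T @ A / 2000

def sample_B(batch=64):
    """Unbiased random estimate of B from a minibatch of rows of A (assumed setup)."""
    idx = rng.integers(0, 2000, size=batch)
    Ab = A[idx]
    return Ab.T @ Ab / batch

X = 0.1 * rng.standard_normal((n, p))
eta = 0.01
for _ in range(4000):
    Bk = sample_B()
    N = X.T @ Bk @ X - np.eye(p)   # stochastic constraint residual
    # Stochastic gradient step on (1/4)||X^T B_k X - I_p||_F^2:
    # only matrix multiplications, never the full matrix B.
    X = X - eta * Bk @ X @ N

# The constraint holds approximately with respect to the true B.
residual = np.linalg.norm(X.T @ B @ X - np.eye(p))
```

With a fixed step size the stochastic iterates hover near the manifold (a noise floor set by the minibatch variance remains); in the paper's setting the analysis gives convergence guarantees matching Riemannian counterparts that use the full B.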
