Sharper Convergence Rates for Nonconvex Optimisation via Reduction Mappings

10 June 2025
Evan Markou
Thalaiyasingam Ajanthan
Stephen Gould
Abstract

Many high-dimensional optimisation problems exhibit rich geometric structures in their set of minimisers, often forming smooth manifolds due to over-parametrisation or symmetries. When this structure is known, at least locally, it can be exploited through reduction mappings that reparametrise part of the parameter space to lie on the solution manifold. These reductions naturally arise from inner optimisation problems and effectively remove redundant directions, yielding a lower-dimensional objective. In this work, we introduce a general framework to understand how such reductions influence the optimisation landscape. We show that well-designed reduction mappings improve curvature properties of the objective, leading to better-conditioned problems and theoretically faster convergence for gradient-based methods. Our analysis unifies a range of scenarios where structural information at optimality is leveraged to accelerate convergence, offering a principled explanation for the empirical gains observed in such optimisation algorithms.
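A canonical instance of such a reduction mapping is variable projection for separable nonlinear least squares: for fixed nonlinear parameters u, the inner problem over the linear coefficients v has a closed-form minimiser v*(u), and optimisation proceeds on the reduced objective g(u) = f(u, v*(u)). The sketch below (in JAX, with illustrative names and synthetic data, not code from the paper) shows how eliminating the inner variable yields a lower-dimensional objective on which gradient descent runs directly.

import jax
import jax.numpy as jnp

t = jnp.linspace(0.0, 1.0, 50)

def model_matrix(u):
    # Hypothetical nonlinear design matrix A(u); illustrative only.
    return jnp.stack([jnp.exp(-u[0] * t), jnp.exp(-u[1] * t)], axis=1)

# Synthetic observations b = A(u*) v* from made-up ground truth.
true_u = jnp.array([0.5, 2.0])
true_v = jnp.array([1.0, -0.5])
b = model_matrix(true_u) @ true_v

def inner_solve(u):
    # Inner optimisation problem: for fixed u, the minimiser over the
    # linear coefficients v is given in closed form by the normal equations.
    A = model_matrix(u)
    return jnp.linalg.solve(A.T @ A, A.T @ b)

def reduced_objective(u):
    # Reduction mapping: evaluate the full objective at (u, v*(u)),
    # removing the redundant directions along v.
    A = model_matrix(u)
    r = A @ inner_solve(u) - b
    return 0.5 * jnp.dot(r, r)

# Gradient descent on the reduced, lower-dimensional objective;
# autodiff differentiates through the inner solve.
grad_fn = jax.jit(jax.grad(reduced_objective))
u = jnp.array([1.0, 3.0])
for _ in range(500):
    u = u - 0.2 * grad_fn(u)

Because v*(u) lies exactly on the inner solution manifold, the remaining objective over u alone is typically better conditioned than joint descent on (u, v), mirroring the curvature improvement the abstract describes.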

@article{markou2025_2506.08428,
  title={Sharper Convergence Rates for Nonconvex Optimisation via Reduction Mappings},
  author={Evan Markou and Thalaiyasingam Ajanthan and Stephen Gould},
  journal={arXiv preprint arXiv:2506.08428},
  year={2025}
}