Inexact subgradient methods for semialgebraic functions

Abstract

Motivated by the widespread use of approximate gradients in machine learning and optimization, we study inexact subgradient methods subject to persistent additive errors. In a nonconvex semialgebraic setting, assuming boundedness or coercivity, we show that the method produces iterates that eventually fluctuate near the critical set at a distance of order O(ϵ^ρ), where ϵ is the magnitude of the subgradient evaluation errors and ρ encodes geometric features of the underlying problem. Our analysis covers both vanishing and constant step-size regimes; the latter inherently enlarges the fluctuation region, yet the enlargement remains of order ϵ^ρ. In the convex case, using a universal error bound for coercive semialgebraic functions, we derive new complexity results for averaged iterates. Our study also yields auxiliary results of independent interest, including descent-type lemmas for nonsmooth nonconvex functions and an invariance principle describing the behavior of algorithmic sequences under small-step limits.
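The abstract gives no pseudocode; the following is a minimal Python sketch of the kind of iteration it describes, assuming the standard inexact form x_{k+1} = x_k − α_k (g_k + e_k) with ‖e_k‖ ≤ ϵ. The function names (`inexact_subgradient`, `subgrad`) and the test problem f(x) = ‖x‖₁ are illustrative choices, not taken from the paper.

```python
import numpy as np

def inexact_subgradient(subgrad, x0, steps, eps, n_iter=10_000, seed=0):
    """Run x_{k+1} = x_k - steps(k) * (g_k + e_k), where g_k = subgrad(x_k)
    and e_k is a persistent additive error with ||e_k|| = eps."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(n_iter):
        g = subgrad(x)
        e = rng.uniform(-1.0, 1.0, size=x.shape)
        e *= eps / max(np.linalg.norm(e), 1e-12)  # scale error to the eps-sphere
        x = x - steps(k) * (g + e)
    return x

# Illustrative problem: f(x) = ||x||_1, a coercive semialgebraic function
# whose critical set is {0}; np.sign gives a valid subgradient selection.
subgrad = lambda x: np.sign(x)

# Constant vs. vanishing step sizes; both runs end up fluctuating near 0,
# the constant-step run in a (moderately) larger region.
x_const = inexact_subgradient(subgrad, np.ones(5), steps=lambda k: 1e-3, eps=0.1)
x_vanish = inexact_subgradient(subgrad, np.ones(5), steps=lambda k: 1.0 / (k + 1), eps=0.1)
print(np.linalg.norm(x_const), np.linalg.norm(x_vanish))
```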

@article{bolte2025_2404.19517,
  title={Inexact subgradient methods for semialgebraic functions},
  author={Jérôme Bolte and Tam Le and Éric Moulines and Edouard Pauwels},
  journal={arXiv preprint arXiv:2404.19517},
  year={2025}
}