Stochastic First-Order Methods with Non-smooth and Non-Euclidean Proximal Terms for Nonconvex High-Dimensional Stochastic Optimization

27 June 2024
Yue Xie, Jiawen Bi, Hongcheng Liu
Abstract

When a nonconvex problem is complicated by stochasticity, the sample complexity of stochastic first-order methods may depend linearly on the problem dimension, which is undesirable for large-scale problems. In this work, we propose dimension-insensitive stochastic first-order methods (DISFOMs) to address nonconvex optimization with an expected-valued objective function. Our algorithms allow non-Euclidean and non-smooth distance functions as the proximal terms. Under mild assumptions, we show that DISFOM using minibatches to estimate the gradient enjoys a sample complexity of $\mathcal{O}((\log d)/\epsilon^4)$ to obtain an $\epsilon$-stationary point. Furthermore, we prove that DISFOM employing variance reduction sharpens this bound to $\mathcal{O}((\log d)^{2/3}/\epsilon^{10/3})$, which appears to be the best-known sample complexity result in terms of $d$. We provide two choices of non-smooth distance function, both of which admit closed-form solutions to the proximal step. Numerical experiments illustrate the dimension-insensitive property of the proposed frameworks.
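The abstract does not spell out the paper's two distance functions, so the following is only a minimal sketch of the kind of update such methods take. It uses the classical $\ell_p$ mirror map with $p = 1 + 1/\log d$, a standard device for obtaining $\log d$-type dimension dependence, together with a SPIDER/SARAH-style recursive estimator as one common form of variance reduction. The function names, the gradient oracle `stoch_grad`, and the step size `eta` are illustrative assumptions, not the paper's actual constructions.

```python
import numpy as np

def lp_mirror_step(x, grad, eta, d):
    """One stochastic first-order step with the mirror map
    psi(x) = 0.5 * ||x||_p^2, p = 1 + 1/log(d).

    This classical choice yields (log d)-type dimension dependence and
    serves as an illustrative stand-in for the paper's proximal terms.
    """
    p = 1.0 + 1.0 / np.log(d)
    q = p / (p - 1.0)  # Hoelder conjugate exponent of p

    # Dual (mirror) map: grad psi(x)_i = ||x||_p^(2-p) * |x_i|^(p-1) * sign(x_i)
    norm_p = np.linalg.norm(x, ord=p)
    scale_p = norm_p ** (2.0 - p) if norm_p > 0 else 0.0
    y = scale_p * np.sign(x) * np.abs(x) ** (p - 1.0) - eta * grad

    # Inverse map: grad psi*(y), where psi*(y) = 0.5 * ||y||_q^2
    norm_q = np.linalg.norm(y, ord=q)
    scale_q = norm_q ** (2.0 - q) if norm_q > 0 else 0.0
    return scale_q * np.sign(y) * np.abs(y) ** (q - 1.0)


def spider_estimator(stoch_grad, x, x_prev, v_prev, batch):
    """SPIDER/SARAH-style recursive gradient estimator, one common way to
    realize the variance reduction mentioned in the abstract (hypothetical
    here; the paper's estimator may differ). stoch_grad(x, batch) is an
    assumed user-supplied minibatch gradient oracle.
    """
    return v_prev + stoch_grad(x, batch) - stoch_grad(x_prev, batch)
```

Note that each mirror step above is closed-form (two power transforms and a rescaling), matching the abstract's requirement that the proximal step admit a closed-form solution.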
