
A stochastic first-order method with multi-extrapolated momentum for highly smooth unconstrained optimization

Abstract

In this paper, we consider an unconstrained stochastic optimization problem where the objective function exhibits high-order smoothness. Specifically, we propose a new stochastic first-order method (SFOM) with multi-extrapolated momentum, in which multiple extrapolations are performed in each iteration, followed by a momentum update based on these extrapolations. We demonstrate that the proposed SFOM can accelerate optimization by exploiting the high-order smoothness of the objective function $f$. Assuming that the $p$th-order derivative of $f$ is Lipschitz continuous for some $p\ge2$, and under additional mild assumptions, we establish that our method achieves a sample complexity of $\widetilde{\mathcal{O}}(\epsilon^{-(3p+1)/p})$ for finding a point $x$ such that $\mathbb{E}[\|\nabla f(x)\|]\le\epsilon$. To the best of our knowledge, this is the first SFOM to leverage arbitrary-order smoothness of the objective function for acceleration, resulting in a sample complexity that improves upon the best-known results without assuming the mean-squared smoothness condition. Preliminary numerical experiments validate the practical performance of our method and support our theoretical findings.
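To make the idea of "multiple extrapolations per iteration followed by a momentum update" concrete, here is a minimal toy sketch in NumPy. It is not the paper's algorithm: the extrapolation coefficients, the gradient averaging, and the constant step sizes below are illustrative assumptions, and the toy objective $f(x)=\tfrac12\|x\|^2$ with additive Gaussian gradient noise stands in for a general stochastic oracle.

```python
import numpy as np

def stochastic_grad(x, rng, noise=0.1):
    # Noisy gradient of the toy objective f(x) = 0.5 * ||x||^2.
    return x + noise * rng.standard_normal(x.shape)

def sfom_multi_extrapolated(x0, steps=200, lr=0.1, beta=0.9,
                            extrap_coeffs=(0.5, 1.0), seed=0):
    """Hypothetical sketch of an SFOM with multi-extrapolated momentum.

    Each iteration samples gradients at several extrapolated points
    x_k + a * (x_k - x_{k-1}) and folds them into a momentum buffer.
    The coefficients and step sizes here are illustrative guesses,
    not the schedule analyzed in the paper.
    """
    rng = np.random.default_rng(seed)
    x_prev = x0.copy()
    x = x0.copy()
    m = np.zeros_like(x0)
    for _ in range(steps):
        d = x - x_prev  # displacement used for the extrapolations
        # Average stochastic gradients over the extrapolated points.
        g = np.mean([stochastic_grad(x + a * d, rng)
                     for a in extrap_coeffs], axis=0)
        m = beta * m + (1 - beta) * g  # momentum update from extrapolated gradients
        x_prev, x = x, x - lr * m
    return x

x_final = sfom_multi_extrapolated(np.array([5.0, -3.0]))
print(np.linalg.norm(x_final))
```

On this strongly convex toy problem the iterates settle near the origin up to a noise floor; the sketch is only meant to show where the extrapolated gradient evaluations enter the momentum recursion.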

@article{he2025_2412.14488,
  title={A stochastic first-order method with multi-extrapolated momentum for highly smooth unconstrained optimization},
  author={Chuan He},
  journal={arXiv preprint arXiv:2412.14488},
  year={2025}
}