Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization

Abstract
We consider the optimization problem of the form $\min_{x \in \mathbb{R}^d} f(x) \triangleq \mathbb{E}_{\xi}[F(x;\xi)]$, where the component $F(x;\xi)$ is $L$-mean-squared Lipschitz but possibly nonconvex and nonsmooth. The recently proposed gradient-free method requires at most $\mathcal{O}(L^4 d^{3/2} \epsilon^{-4} + \Delta L^3 d^{3/2} \delta^{-1} \epsilon^{-4})$ stochastic zeroth-order oracle complexity to find a $(\delta,\epsilon)$-Goldstein stationary point of the objective function, where $\Delta = f(x_0) - \inf_x f(x)$ and $x_0$ is the initial point of the algorithm. This paper proposes a more efficient algorithm using stochastic recursive gradient estimators, which improves the complexity to $\mathcal{O}(L^4 d^{3/2} \epsilon^{-4} + \Delta L^3 d^{3/2} \delta^{-1} \epsilon^{-3})$.
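The two ingredients named in the abstract can be sketched in a few lines. Below is a minimal, illustrative NumPy sketch (not the paper's implementation): `zo_grad` is the standard two-point zeroth-order gradient estimator obtained by uniform smoothing with radius `delta`, and `zo_spider_step` shows a SPIDER-style recursive update of the gradient estimate, where each random direction `u` and stochastic sample `xi` are shared between the current and previous iterates to reduce variance. All function and variable names here are hypothetical choices for the sketch.

```python
import numpy as np

def zo_grad(F, x, delta, u, xi):
    """Two-point zeroth-order gradient estimator with smoothing radius delta.

    Uses only evaluations of F (no gradients): for u uniform on the unit
    sphere, (d / (2*delta)) * (F(x + delta*u) - F(x - delta*u)) * u is an
    unbiased estimate of the gradient of the delta-smoothed surrogate of F.
    """
    d = x.shape[0]
    return (d / (2.0 * delta)) * (F(x + delta * u, xi) - F(x - delta * u, xi)) * u

def zo_spider_step(F, x, x_prev, v_prev, delta, batch, rng, sample_xi):
    """Recursive (SPIDER-style) variance-reduced gradient estimate:

        v_t = v_{t-1} + mean_i [ g(x_t; u_i, xi_i) - g(x_{t-1}; u_i, xi_i) ],

    where the same direction u_i and sample xi_i are reused at both points
    so the two estimates are correlated and their difference has low variance.
    """
    d = x.shape[0]
    corr = np.zeros(d)
    for _ in range(batch):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)      # uniform direction on the unit sphere
        xi = sample_xi(rng)         # draw one stochastic sample (placeholder)
        corr += zo_grad(F, x, delta, u, xi) - zo_grad(F, x_prev, delta, u, xi)
    return v_prev + corr / batch
```

For a quadratic $F$, averaging many `zo_grad` draws recovers the true gradient, which is a quick way to sanity-check the estimator's unbiasedness.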