Accelerated Stochastic Gradient-free and Projection-free Methods

16 July 2020 (arXiv:2007.12625)
Feihu Huang, Lue Tao, Songcan Chen
Abstract

In this paper, we propose a class of accelerated stochastic gradient-free and projection-free (a.k.a. zeroth-order Frank-Wolfe) methods to solve constrained stochastic and finite-sum nonconvex optimization. Specifically, we propose an accelerated stochastic zeroth-order Frank-Wolfe (Acc-SZOFW) method based on the variance-reduced technique of SPIDER/SpiderBoost and a novel momentum acceleration technique. Moreover, under some mild conditions, we prove that Acc-SZOFW has a function query complexity of $O(d\sqrt{n}\epsilon^{-2})$ for finding an $\epsilon$-stationary point in the finite-sum problem, which improves the existing best result by a factor of $O(\sqrt{n}\epsilon^{-2})$, and a function query complexity of $O(d\epsilon^{-3})$ in the stochastic problem, which improves the existing best result by a factor of $O(\epsilon^{-1})$. To relax the large batches required in Acc-SZOFW, we further propose a novel accelerated stochastic zeroth-order Frank-Wolfe method (Acc-SZOFW*) based on the new variance-reduced technique of STORM, which still reaches the function query complexity of $O(d\epsilon^{-3})$ in the stochastic problem without relying on any large batches. In particular, we present an accelerated framework for Frank-Wolfe methods based on the proposed momentum acceleration technique. Extensive experimental results on black-box adversarial attack and robust black-box classification demonstrate the efficiency of our algorithms.
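To make the ingredients mentioned in the abstract concrete, the sketch below combines a two-point zeroth-order gradient estimator (function queries only), a STORM-style momentum variance-reduced estimator that avoids large batches, and a projection-free Frank-Wolfe update via a linear minimization oracle. It is a minimal illustration under assumptions: the function names (zo_grad, lmo_l1, szofw_storm), the step-size and momentum schedules, and the $\ell_1$-ball constraint are all illustrative choices, not the paper's exact Acc-SZOFW* algorithm or its proved parameter settings.

```python
# Illustrative sketch only: zeroth-order Frank-Wolfe with a STORM-style
# momentum variance-reduced estimator. Schedules and the l1-ball LMO are
# assumptions for this demo, not the schedule analyzed in the paper.
import numpy as np

def zo_grad(f, x, rng, mu=1e-4, num_dirs=10):
    """Two-point Gaussian-smoothing gradient estimate (function queries only)."""
    g = np.zeros_like(x)
    for _ in range(num_dirs):
        u = rng.standard_normal(x.size)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / num_dirs

def lmo_l1(g, radius=1.0):
    """Linear minimization oracle over the l1 ball: argmin_{||v||_1 <= r} <g, v>."""
    v = np.zeros_like(g)
    i = np.argmax(np.abs(g))
    v[i] = -radius * np.sign(g[i])
    return v

def szofw_storm(f_stoch, x0, T=200, radius=1.0, seed=0):
    """Projection-free zeroth-order loop with a STORM-like estimator (sketch)."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    v = zo_grad(f_stoch(rng), x, rng)  # initial estimate from one small sample
    for t in range(1, T + 1):
        gamma = 1.0 / (t + 1)        # Frank-Wolfe step size (assumed schedule)
        a = min(1.0, 2.0 / (t + 1))  # momentum weight (assumed schedule)
        # Convex combination keeps iterates inside the constraint set.
        x_new = x + gamma * (lmo_l1(v, radius) - x)
        f_t = f_stoch(rng)           # one small stochastic sample, no large batch
        # STORM-style recursion: new estimate plus a momentum correction term.
        v = zo_grad(f_t, x_new, rng) + (1 - a) * (v - zo_grad(f_t, x, rng))
        x = x_new
    return x

# Tiny usage example on a stochastic least-squares objective.
if __name__ == "__main__":
    rng0 = np.random.default_rng(1)
    A, b = rng0.standard_normal((50, 20)), rng0.standard_normal(50)
    def f_stoch(rng):
        idx = rng.integers(0, 50, size=5)  # minibatch of rows
        return lambda x: 0.5 * np.mean((A[idx] @ x - b[idx]) ** 2)
    x_star = szofw_storm(f_stoch, np.zeros(20))
    print("||x||_1 =", np.abs(x_star).sum())  # stays within the l1 ball
```

Since the starting point and every LMO output lie in the $\ell_1$ ball, each Frank-Wolfe step is a convex combination of feasible points, so no projection is ever needed; this is the sense in which the method is projection-free.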
