A Note on Zeroth-Order Optimization on the Simplex

2 August 2022
Tijana Zrnic
Eric Mazumdar
arXiv:2208.01185
Abstract

We construct a zeroth-order gradient estimator for a smooth function defined on the probability simplex. The proposed estimator queries points on the simplex only. We prove that projected gradient descent and the exponential weights algorithm, when run with this estimator instead of exact gradients, converge at an $\mathcal{O}(T^{-1/4})$ rate.
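
The abstract describes the method only at a high level. As a rough illustration of how such a scheme can work, here is a minimal Python sketch pairing a generic two-point zeroth-order estimator, restricted to query points on the simplex, with the exponential weights update. The tangent directions $(e_i - e_j)/\sqrt{2}$, the scaling by $(d-1)$, and the names zo_grad_simplex and exponential_weights are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

def zo_grad_simplex(f, x, delta, rng):
    # Two-point zeroth-order estimate that queries f only on the simplex.
    # It perturbs x along a random tangent direction v = (e_i - e_j)/sqrt(2),
    # so both query points x +- step*v keep a unit sum; the step is shrunk
    # so their coordinates also stay nonnegative.
    # (Hypothetical sketch: the paper's estimator may differ.)
    d = len(x)
    i, j = rng.choice(d, size=2, replace=False)
    v = np.zeros(d)
    v[i], v[j] = 1.0 / np.sqrt(2), -1.0 / np.sqrt(2)
    step = min(delta, 0.5 * np.sqrt(2) * min(x[i], x[j]))  # keep queries feasible
    df = (f(x + step * v) - f(x - step * v)) / (2.0 * step)
    # Scaling by (d - 1) makes the estimate match, up to O(step) smoothing
    # bias, the gradient of f projected onto the simplex's tangent space.
    return (d - 1) * df * v

def exponential_weights(f, d, T, eta=0.1, delta=1e-3, seed=0):
    # Exponential-weights (multiplicative) updates driven by the
    # zeroth-order estimate in place of the exact gradient.
    rng = np.random.default_rng(seed)
    x = np.full(d, 1.0 / d)  # uniform start, strictly inside the simplex
    for _ in range(T):
        g = zo_grad_simplex(f, x, delta, rng)
        w = x * np.exp(-eta * g)
        x = w / w.sum()  # renormalize; iterates stay strictly positive
    return x

# Example: minimize a smooth quadratic over the probability simplex.
target = np.array([0.6, 0.3, 0.1])
f = lambda x: 0.5 * np.sum((x - target) ** 2)
print(exponential_weights(f, d=3, T=5000))
```

The projected gradient descent variant mentioned in the abstract would instead take a Euclidean step against the same estimate and then project back onto the simplex.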
