Convex optimization over a probability simplex

15 May 2023
James Chok
Geoffrey M. Vasil
Abstract

We propose a new iteration scheme, the Cauchy-Simplex, to optimize convex problems over the probability simplex $\{w \in \mathbb{R}^n \mid \sum_i w_i = 1 \text{ and } w_i \geq 0\}$. Specifically, we map the simplex to the positive quadrant of a unit sphere, envisage gradient descent in latent variables, and map the result back in a way that only depends on the simplex variable. Moreover, proving rigorous convergence results in this formulation leads inherently to tools from information theory (e.g., cross-entropy and KL divergence). Each iteration of the Cauchy-Simplex consists of simple operations, making it well-suited for high-dimensional problems. In continuous time, we prove that $f(w_T) - f(w^*) = O(1/T)$ for differentiable real-valued convex functions, where $T$ is the number of time steps and $w^*$ is the optimal solution. Numerical experiments of projection onto convex hulls show faster convergence than similar algorithms. Finally, we apply our algorithm to online learning problems and prove the convergence of the average regret for (1) Prediction with expert advice and (2) Universal Portfolios.
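To make the scheme concrete: writing $w_i = \phi_i^2$ with $\|\phi\|_2 = 1$ maps the simplex onto the positive quadrant of the unit sphere, and projected gradient flow in $\phi$ pulls back to a flow $\dot{w} = -w \odot (\nabla f(w) - \langle w, \nabla f(w) \rangle)$ that depends only on $w$, conserves $\sum_i w_i = 1$, and keeps $w > 0$. The sketch below is not the authors' reference implementation: it assumes an explicit-Euler discretization of that flow, with the step size capped so the iterates stay strictly positive, and all names (cauchy_simplex_step, eta) are illustrative. The demo problem is the abstract's projection of a point onto a convex hull.

import numpy as np

def cauchy_simplex_step(w, grad, eta=0.9):
    # One explicit-Euler step of the simplex flow
    #   dw/dt = -w * (grad - <w, grad>).
    dev = grad - w @ grad               # centered gradient; <w, dev> == 0
    denom = np.max(np.abs(dev))
    if denom == 0.0:                    # stationary point: nothing to do
        return w
    h = eta / denom                     # cap so 1 - h*dev > 0 elementwise
    return w * (1.0 - h * dev)          # multiplicative step; sum(w) stays 1

# Demo: project y onto the convex hull of the columns of A,
# i.e. minimize f(w) = 0.5 * ||A w - y||^2 over the simplex.
rng = np.random.default_rng(0)
n, m = 5, 50                            # ambient dim, number of hull points
A = rng.normal(size=(n, m))
y = rng.normal(size=n)

w = np.full(m, 1.0 / m)                 # start at the simplex barycenter
for _ in range(500):
    grad = A.T @ (A @ w - y)            # gradient of f at w
    w = cauchy_simplex_step(w, grad)

print("objective:", 0.5 * np.linalg.norm(A @ w - y) ** 2)
print("sum(w) =", w.sum(), " min(w) =", w.min())

Because the descent direction $w \odot (\nabla f - \langle w, \nabla f \rangle)$ sums to zero, the simplex constraint is maintained without any explicit projection, which is what keeps each iteration cheap in high dimensions.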

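The same step applies directly to the online problems the abstract mentions: with a linear loss $f_t(w) = \langle w, \ell_t \rangle$ the gradient at round $t$ is just the loss vector, so the update runs as a prediction-with-expert-advice strategy. A minimal sketch with synthetic losses, repeating the step so it runs on its own; the names and data are again illustrative, and the abstract's guarantee is convergence of the average regret:

import numpy as np

def cauchy_simplex_step(w, grad, eta=0.9):
    # Same update as in the sketch above, repeated so this block runs alone.
    dev = grad - w @ grad
    denom = np.max(np.abs(dev))
    return w if denom == 0.0 else w * (1.0 - (eta / denom) * dev)

T, m = 1000, 10
rng = np.random.default_rng(1)
losses = rng.uniform(size=(T, m))       # synthetic per-expert losses in [0, 1]

w = np.full(m, 1.0 / m)                 # uniform weights over the experts
cum_alg, cum_experts = 0.0, np.zeros(m)
for ell in losses:
    cum_alg += w @ ell                  # loss paid this round
    cum_experts += ell                  # each expert's cumulative loss
    w = cauchy_simplex_step(w, ell)     # gradient of <w, ell> is ell

print("average regret:", (cum_alg - cum_experts.min()) / T)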
@article{chok2025_2305.09046,
  title={Convex optimization over a probability simplex},
  author={James Chok and Geoffrey M. Vasil},
  journal={arXiv preprint arXiv:2305.09046},
  year={2025}
}