Generalized Mixability via Entropic Duality

24 June 2014
Mark D. Reid
Rafael Frongillo
Robert C. Williamson
Nishant A. Mehta
arXiv:1406.6130
Abstract

Mixability is a property of a loss which characterizes when fast convergence is possible in the game of prediction with expert advice. We show that a key property of mixability generalizes, and the exp and log operations present in the usual theory are not as special as one might have thought. In doing this we introduce a more general notion of Φ-mixability where Φ is a general entropy (i.e., any convex function on probabilities). We show how a property shared by the convex dual of any such entropy yields a natural algorithm (the minimizer of a regret bound) which, analogous to the classical aggregating algorithm, is guaranteed a constant regret when used with Φ-mixable losses. We characterize precisely which Φ have Φ-mixable losses and put forward a number of conjectures about the optimality and relationships between different choices of entropy.
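For context, here is the classical notion the paper generalizes; this is the standard definition of η-mixability from the prediction-with-expert-advice literature, not text from the abstract itself. A loss ℓ : A × Y → [0, ∞] is η-mixable (for some η > 0) if for every distribution p over expert predictions a_1, …, a_n there is a single prediction a* satisfying

\[
\ell(a^*, y) \;\le\; -\frac{1}{\eta} \log \sum_{i=1}^{n} p_i \, e^{-\eta\, \ell(a_i, y)}
\qquad \text{for all } y \in Y .
\]

The "entropic duality" of the title enters through the convex conjugate: if Φ is the scaled Shannon negentropy on the probability simplex (taken to be +∞ elsewhere), its conjugate is exactly the log-sum-exp appearing above,

\[
\Phi(p) = \frac{1}{\eta} \sum_{i} p_i \log p_i
\quad\Longrightarrow\quad
\Phi^*(v) = \sup_{p}\, \big( \langle v, p \rangle - \Phi(p) \big)
= \frac{1}{\eta} \log \sum_{i} e^{\eta v_i} .
\]

Replacing this particular Φ by an arbitrary convex entropy, with Φ* standing in for log-sum-exp, is what yields Φ-mixability; the precise generalized condition and the regret-bound-minimizing algorithm built from it are developed in the paper.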
