The Geometry of Mixability

23 February 2023
Armando J. Cabrera Pacheco
Robert C. Williamson
arXiv:2302.11905
Abstract

Mixable loss functions are of fundamental importance in the context of prediction with expert advice in the online setting, since they characterize fast learning rates. By re-interpreting properness from the point of view of differential geometry, we provide a simple geometric characterization of mixability for the binary and multi-class cases: a proper loss function $\ell$ is $\eta$-mixable if and only if the superprediction set $\textrm{spr}(\eta \ell)$ of the scaled loss function $\eta \ell$ slides freely inside the superprediction set $\textrm{spr}(\ell_{\log})$ of the log loss $\ell_{\log}$, under fairly general assumptions on the differentiability of $\ell$. Our approach provides a way to treat some concepts concerning loss functions (such as properness) in a "coordinate-free" manner and reconciles previous results obtained for mixable loss functions in the binary and multi-class cases.
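
As a brief reminder of the two notions the characterization relates (these standard definitions are not part of the abstract and are recalled here from the usual expert-advice literature, e.g. Vovk's aggregating algorithm), the superprediction set of a loss $\ell$ over $n$ outcomes, with loss vector $\ell(p) = (\ell_1(p), \dots, \ell_n(p))$ for a prediction $p$, can be stated as
\[
\textrm{spr}(\ell) \;=\; \bigl\{ x \in [0,\infty]^n \;:\; \exists\, p \ \text{such that} \ x_i \ge \ell_i(p) \ \text{for all outcomes } i \bigr\},
\]
and $\ell$ is said to be $\eta$-mixable for $\eta > 0$ when the image of $\textrm{spr}(\ell)$ under the coordinate-wise map $x \mapsto e^{-\eta x}$ is a convex subset of $[0,1]^n$. "Slides freely" is understood in the usual convex-geometry sense: a set $K$ slides freely inside $L$ if every boundary point of $L$ lies on some translate of $K$ contained in $L$.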
