Mixability made efficient: Fast online multiclass logistic regression

8 October 2021
Rémi Jézéquel
Pierre Gaillard
Alessandro Rudi
Abstract

Mixability has been shown to be a powerful tool to obtain algorithms with optimal regret. However, the resulting methods often suffer from high computational complexity, which has reduced their practical applicability. For example, in the case of multiclass logistic regression, the aggregating forecaster (Foster et al., 2018) achieves a regret of $O(\log(Bn))$, whereas Online Newton Step achieves $O(e^B \log(n))$, obtaining a double exponential gain in $B$ (a bound on the norm of the comparison functions). However, this high statistical performance comes at the price of a prohibitive computational complexity $O(n^{37})$.
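To make the setting concrete, here is a minimal sketch of online multiclass logistic regression: at each round the learner predicts class probabilities with a softmax over linear scores, incurs the log loss, and then updates. This is a toy illustration of the problem, not the paper's algorithm or the aggregating forecaster; the plain online gradient step, the learning rate, and all names are assumptions for illustration.

```python
# Toy sketch of the online multiclass logistic regression setting.
# NOT the paper's method: a plain online gradient step stands in for
# the (more sophisticated) algorithms discussed in the abstract.
import numpy as np

def softmax(z):
    z = z - z.max()              # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def online_logistic_regression(stream, d, k, lr=0.1):
    """stream yields (x, y) pairs with x in R^d and label y in {0..k-1};
    returns the cumulative log loss over the stream."""
    W = np.zeros((k, d))         # one weight vector per class
    total_loss = 0.0
    for x, y in stream:
        p = softmax(W @ x)       # predict before seeing the label
        total_loss += -np.log(p[y])
        g = p.copy()
        g[y] -= 1.0              # gradient of the log loss w.r.t. scores
        W -= lr * np.outer(g, x) # online gradient update
    return total_loss

# Usage on a small synthetic stream (random features and labels).
rng = np.random.default_rng(0)
data = [(rng.normal(size=3), int(rng.integers(0, 4))) for _ in range(50)]
loss = online_logistic_regression(data, d=3, k=4)
```

The regret of such a learner is measured against the best fixed weight matrix in hindsight with norm at most $B$; the bounds quoted in the abstract compare how different algorithms trade this regret against per-round computation.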
