Tighter Generalisation Bounds via Interpolation

7 February 2024
Paul Viallard
Maxime Haddouche
Umut Simsekli
Benjamin Guedj
arXiv:2402.05101
Abstract

This paper contains a recipe for deriving new PAC-Bayes generalisation bounds based on the (f, Γ)-divergence and, in addition, presents PAC-Bayes generalisation bounds where we interpolate between a series of probability divergences (including but not limited to KL, Wasserstein, and total variation), making the best out of many worlds depending on the posterior distribution's properties. We explore the tightness of these bounds and connect them to earlier results from statistical learning, which arise as special cases. We also instantiate our bounds as training objectives, yielding non-trivial guarantees and practical performance.
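For context, a representative member of the divergence family these results interpolate between is the classical KL-based PAC-Bayes bound in the style of McAllester. The sketch below is standard background rather than a result from this paper, and the notation (population risk R, empirical risk R̂_n, prior π, posterior ρ, sample size n, confidence level δ, loss bounded in [0, 1]) is assumed here, not taken from the abstract. With probability at least 1 − δ over an i.i.d. sample of size n, simultaneously for all posteriors ρ:

\[
  \mathbb{E}_{h \sim \rho}\bigl[R(h)\bigr]
  \;\le\;
  \mathbb{E}_{h \sim \rho}\bigl[\hat{R}_n(h)\bigr]
  + \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}.
\]

Bounds of the kind described in the abstract replace or interpolate the KL term with other divergences; for instance, Wasserstein-based variants can remain finite even when the posterior is not absolutely continuous with respect to the prior, a case in which the KL term is infinite.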
