PAC-Bayes Un-Expected Bernstein Inequality

arXiv:1905.13367 · 31 May 2019
Zakaria Mhammedi, Peter Grünwald, Benjamin Guedj
Abstract

We present a new PAC-Bayesian generalization bound. Standard bounds contain a $\sqrt{L_n \cdot \mathrm{KL}/n}$ complexity term which dominates unless $L_n$, the empirical error of the learning algorithm's randomized predictions, vanishes. We manage to replace $L_n$ by a term which vanishes in many more situations, essentially whenever the employed learning algorithm is sufficiently stable on the dataset at hand. Our new bound consistently beats state-of-the-art bounds both on a toy example and on UCI datasets (with large enough $n$). Theoretically, unlike existing bounds, our new bound can be expected to converge to $0$ faster whenever a Bernstein/Tsybakov condition holds, thus connecting PAC-Bayesian generalization and excess risk bounds; for the latter it has long been known that faster convergence can be obtained under Bernstein conditions. Our main technical tool is a new concentration inequality which is like Bernstein's but with $X^2$ taken outside its expectation.
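For context, here is a sketch of the standard bound shape the abstract refers to, written in conventional PAC-Bayes notation (prior $\pi$, posterior $\rho$, population risk $L$, empirical risk $L_n$; this notation is standard but not taken from this page). The $\sqrt{L_n \cdot \mathrm{KL}/n}$ complexity term arises, for example, when the classical PAC-Bayes-kl bound (Seeger, 2002; Maurer, 2004) is relaxed via the usual kl inversion:

% Classical PAC-Bayes-kl bound: with probability at least 1 - \delta over an
% i.i.d. sample of size n, simultaneously for all posteriors \rho,
\[
  \mathrm{kl}\!\left(L_n(\rho)\,\middle\|\,L(\rho)\right)
  \;\le\;
  \frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln\frac{2\sqrt{n}}{\delta}}{n},
\]
% where kl(p || q) denotes the binary relative entropy between Bernoulli
% parameters p and q. Inverting kl (a refined Pinsker argument) gives the
% familiar relaxation whose middle term is the sqrt(L_n * KL / n) complexity
% term from the abstract; it dominates unless L_n(\rho) vanishes:
\[
  L(\rho)
  \;\le\;
  L_n(\rho)
  + \sqrt{\frac{2\,L_n(\rho)\left(\mathrm{KL}(\rho\,\|\,\pi) + \ln\frac{2\sqrt{n}}{\delta}\right)}{n}}
  + \frac{2\left(\mathrm{KL}(\rho\,\|\,\pi) + \ln\frac{2\sqrt{n}}{\delta}\right)}{n}.
\]

Classical Bernstein-type arguments control the exponential moment of $\mathbb{E}[X] - X$ through the second moment $\mathbb{E}[X^2]$; per the abstract, the paper's titular inequality instead keeps $X^2$ itself (un-expected) inside the exponential moment, which is what allows the empirical, algorithm-dependent term to replace $L_n$. See the paper for the precise statement and constants.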
