A PAC-Bayesian Link Between Generalisation and Flat Minima

13 February 2024
Maxime Haddouche
Paul Viallard
Umut Simsekli
Benjamin Guedj
Abstract

Modern machine learning usually involves predictors in the overparameterised setting (the number of trained parameters exceeds the dataset size), and their training yields not only good performance on training data but also good generalisation capacity. This phenomenon challenges many theoretical results and remains an open problem. To reach a better understanding, we provide novel generalisation bounds involving gradient terms. To do so, we combine the PAC-Bayes toolbox with Poincaré and Log-Sobolev inequalities, avoiding an explicit dependency on the dimension of the predictor space. Our results highlight the positive influence of flat minima (minima whose neighbourhood nearly minimises the learning problem as well) on generalisation performance, directly involving the benefits of the optimisation phase.
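As background to the abstract, the sketch below recalls a classical PAC-Bayes bound and the Poincaré inequality the abstract alludes to. This is standard illustrative material, not the paper's actual theorem; the symbols (prior P, posterior Q, sample size n, a loss bounded in [0,1]) are generic assumptions.

% Illustrative background only; not the theorem proved in this paper.
% Classical PAC-Bayes bound (a relaxation of Maurer's bound): for a loss
% taking values in [0,1], with probability at least 1 - \delta over an
% i.i.d. sample S of size n, simultaneously for all posteriors Q,
\[
  \mathbb{E}_{h \sim Q}\!\left[R(h)\right]
  \le \mathbb{E}_{h \sim Q}\!\left[\widehat{R}_S(h)\right]
  + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{n}}{\delta}}{2n}},
\]
% where R is the population risk, \widehat{R}_S the empirical risk on S,
% and P a data-free prior. A Poincaré inequality for a measure \pi with
% constant C_P controls variance by an expected squared gradient: for all
% smooth f,
\[
  \operatorname{Var}_{\pi}(f) \le C_P \,\mathbb{E}_{\pi}\!\left[\lVert \nabla f \rVert^{2}\right].
\]
% This suggests how gradient terms can replace an explicit dependence on
% the predictor dimension: around a flat minimum, the expected squared
% gradient of the loss is small.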

View on arXiv: https://arxiv.org/abs/2402.08508
@article{haddouche2024_2402.08508,
  title={A PAC-Bayesian Link Between Generalisation and Flat Minima},
  author={Maxime Haddouche and Paul Viallard and Umut Simsekli and Benjamin Guedj},
  journal={arXiv preprint arXiv:2402.08508},
  year={2024}
}