
Generalisation under gradient descent via deterministic PAC-Bayes

Abstract

We establish disintegrated PAC-Bayesian generalisation bounds for models trained with gradient descent methods or continuous gradient flows. Contrary to standard practice in the PAC-Bayesian setting, our result applies to optimisation algorithms that are deterministic, without requiring any de-randomisation step. Our bounds are fully computable, depending on the density of the initial distribution and the Hessian of the training objective over the trajectory. We show that our framework can be applied to a variety of iterative optimisation algorithms, including stochastic gradient descent (SGD), momentum-based schemes, and damped Hamiltonian dynamics.
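As a concrete illustration of the quantities the bound depends on, here is a minimal sketch in JAX. It assumes (as the underlying mechanism, not the paper's exact statement) that the relevant object is the density of the pushforward of the initial distribution under the deterministic gradient-descent map w -> w - eta * grad L(w), tracked via the change-of-variables formula: each step contributes -log|det(I - eta * H(w_t))|, where H is the Hessian of the training objective at the current iterate. The loss below is a hypothetical toy objective.

import jax
import jax.numpy as jnp
from jax.scipy.stats import multivariate_normal

def loss(w):
    # Hypothetical toy training objective with a non-constant Hessian.
    return 0.5 * jnp.sum(w ** 2) + jnp.sum(jnp.cos(w))

def density_along_gd(w0, eta, n_steps):
    """Track the log-density of the pushforward of a standard Gaussian
    initialisation under deterministic gradient descent.

    Per step, for w_{t+1} = w_t - eta * grad L(w_t):
        log q_{t+1}(w_{t+1}) = log q_t(w_t) - log|det(I - eta * H(w_t))|,
    where H is the Hessian of the training objective.
    """
    d = w0.shape[0]
    log_q = multivariate_normal.logpdf(w0, jnp.zeros(d), jnp.eye(d))
    w = w0
    for _ in range(n_steps):
        H = jax.hessian(loss)(w)                          # Hessian at the current iterate
        _, logdet = jnp.linalg.slogdet(jnp.eye(d) - eta * H)
        log_q = log_q - logdet                            # density update along the trajectory
        w = w - eta * jax.grad(loss)(w)                   # deterministic GD step
    return w, log_q

if __name__ == "__main__":
    w_final, log_q = density_along_gd(jnp.array([1.0, -0.5]), eta=0.1, n_steps=50)
    print(w_final, log_q)

The resulting log-density at the realised trajectory is computable from the initial density and the Hessians along the path, matching the dependence described in the abstract; the bound itself would combine such a term with a confidence penalty.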

@article{clerico2025_2209.02525,
  title={Generalisation under gradient descent via deterministic PAC-Bayes},
  author={Eugenio Clerico and Tyler Farghly and George Deligiannidis and Benjamin Guedj and Arnaud Doucet},
  journal={arXiv preprint arXiv:2209.02525},
  year={2025}
}