
A PAC-Bayes bound for deterministic classifiers

Abstract

We establish a disintegrated PAC-Bayesian bound for classifiers that are trained via continuous-time (non-stochastic) gradient descent. In contrast to what is standard in the PAC-Bayesian setting, our result applies to a training algorithm that is deterministic conditionally on a random initialisation, without requiring any de-randomisation step. We provide a broad discussion of the main features of the proposed bound and study its behaviour analytically and empirically on linear models, finding promising results.
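As a rough illustration of the training setup the abstract refers to (not the paper's method or bound), the sketch below runs continuous-time gradient descent, i.e. the gradient flow dw/dt = -∇L(w), approximated by small Euler steps, on a linear classifier with a random Gaussian initialisation; the trajectory is deterministic once the initialisation is drawn. All names, data, and hyper-parameters here are assumptions for illustration only.

```python
# Illustrative sketch (assumptions, not the paper's code): gradient flow on a
# linear model, approximated by Euler steps; deterministic given the random init.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification data
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = np.sign(X @ w_true)

def loss_grad(w):
    """Gradient of the mean logistic loss log(1 + exp(-y * <x, w>)) at w."""
    margins = y * (X @ w)
    return -(X * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0)

# Random (Gaussian) initialisation; everything after this draw is deterministic
w = rng.normal(size=d)

# Euler discretisation of dw/dt = -grad L(w), step size dt, up to time T
dt, T = 0.01, 5.0
for _ in range(int(T / dt)):
    w -= dt * loss_grad(w)

train_error = np.mean(np.sign(X @ w) != y)
print(f"training error after gradient flow: {train_error:.3f}")
```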

@article{clerico2025_2209.02525,
  title={Generalisation under gradient descent via deterministic PAC-Bayes},
  author={Eugenio Clerico and Tyler Farghly and George Deligiannidis and Benjamin Guedj and Arnaud Doucet},
  journal={arXiv preprint arXiv:2209.02525},
  year={2025}
}