
Generalization Bounds for Dependent Data using Online-to-Batch Conversion

Abstract

In this work, we upper bound the generalization error of batch learning algorithms trained on samples drawn from a mixing stochastic process (i.e., a dependent data source), both in expectation and with high probability. Unlike previous results by Mohri et al. (2010) and Fu et al. (2023), our work does not require any stability assumptions on the batch learner, which allows us to derive upper bounds for any batch learning algorithm trained on dependent data. This is made possible by our use of the Online-to-Batch (OTB) conversion framework, which lets us shift the burden of stability from the batch learner to an artificially constructed online learner. We show that our bounds match the bounds in the i.i.d. setting up to a term that depends on the decay rate of the underlying mixing stochastic process. Central to our analysis is a new notion of algorithmic stability for online learning algorithms based on Wasserstein distances of order one. Furthermore, we prove that the Exponentially Weighted Average (EWA) algorithm, a textbook family of online learning algorithms, satisfies our new notion of stability. Finally, we instantiate our bounds for the EWA algorithm.
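To make the two ingredients named in the abstract concrete, the sketch below illustrates the textbook EWA forecaster over a finite set of experts together with the standard averaging-based online-to-batch conversion. It is a minimal illustration only, not the paper's construction: the function name ewa_online_to_batch, the learning-rate choice, and the loss setup are all assumptions made for the example.

    import numpy as np

    def ewa_online_to_batch(losses, eta=0.3):
        """Minimal sketch (illustrative, not the authors' construction):
        run the Exponentially Weighted Average (EWA) forecaster over a
        sequence of expert losses, then apply the standard online-to-batch
        conversion by averaging the online iterates.

        losses : array of shape (T, K), loss of each of K experts at each
                 of T rounds, assumed bounded in [0, 1].
        eta    : EWA learning rate (hypothetical choice for the demo).
        Returns the averaged weight vector over the K experts.
        """
        losses = np.asarray(losses, dtype=float)
        T, K = losses.shape
        cum_loss = np.zeros(K)        # cumulative loss of each expert
        iterates = np.zeros((T, K))   # EWA weights produced at each round
        for t in range(T):
            w = np.exp(-eta * cum_loss)
            w /= w.sum()              # EWA distribution over experts
            iterates[t] = w
            cum_loss += losses[t]     # reveal round-t losses and update
        # Online-to-batch conversion: average of the online iterates.
        return iterates.mean(axis=0)

    # Usage: 100 rounds, 5 experts with synthetic losses in [0, 1].
    rng = np.random.default_rng(0)
    print(ewa_online_to_batch(rng.random((100, 5))))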

@article{chatterjee2025_2405.13666,
  title={Generalization Bounds for Dependent Data using Online-to-Batch Conversion},
  author={Sagnik Chatterjee and Manuj Mukherjee and Alhad Sethi},
  journal={arXiv preprint arXiv:2405.13666},
  year={2025}
}