Online Estimation with Rolling Validation: Adaptive Nonparametric Estimation with Streaming Data

Online nonparametric estimators are gaining popularity due to their efficient computation and competitive generalization abilities. An important example is the family of stochastic gradient descent variants. These algorithms typically take one sample point at a time and incrementally update the parameter estimate of interest. In this work, we consider model selection and hyperparameter tuning for such online algorithms. We propose a weighted rolling validation procedure, an online variant of leave-one-out cross-validation, that requires minimal extra computation for many typical stochastic gradient descent estimators and preserves their online nature. Similar to batch cross-validation, it can boost base estimators to achieve better empirical performance and an adaptive convergence rate. Our analysis is straightforward, relying mainly on some general statistical stability assumptions. The simulation study underscores the practical significance of diverging weights and demonstrates the procedure's favorable sensitivity even when the differences between candidate estimators are slim.
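To make the rolling-validation idea concrete, below is a minimal Python sketch of how such a procedure could operate on a data stream: each candidate online estimator first predicts the incoming point (playing the role of a held-out sample), its prediction error is added to a running score with a diverging weight, and only then is the point used for the SGD update. The linear model, squared loss, step-size schedule, candidate grid, and the weight exponent `xi` are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

class SGDLinear:
    """Online least-squares linear model; one SGD step per sample.

    The learning-rate exponent is the hyperparameter being tuned
    (a hypothetical candidate grid for illustration).
    """
    def __init__(self, dim, lr_exponent):
        self.w = np.zeros(dim)
        self.lr_exponent = lr_exponent
        self.t = 0

    def predict(self, x):
        return self.w @ x

    def update(self, x, y):
        self.t += 1
        lr = self.t ** (-self.lr_exponent)   # decaying step size
        grad = (self.predict(x) - y) * x     # squared-loss gradient
        self.w -= lr * grad

def rolling_validation(stream, dim, lr_exponents, xi=0.5):
    """Track each candidate's weighted one-step prediction error online.

    At time t, the not-yet-seen point (x_t, y_t) acts as the held-out
    sample (the rolling analogue of leave-one-out CV). Errors are
    weighted by t**xi, so later performance dominates the score
    (a diverging-weights scheme, as emphasized in the abstract).
    """
    cands = [SGDLinear(dim, a) for a in lr_exponents]
    scores = np.zeros(len(cands))
    for t, (x, y) in enumerate(stream, start=1):
        weight = t ** xi
        for k, est in enumerate(cands):
            scores[k] += weight * (est.predict(x) - y) ** 2  # validate first
            est.update(x, y)                                 # then train
    best = int(np.argmin(scores))
    return cands[best], lr_exponents[best]

# Toy usage: stream from a noisy linear model.
rng = np.random.default_rng(0)
dim = 5
w_true = rng.normal(size=dim)
stream = ((x, x @ w_true + 0.1 * rng.normal())
          for x in rng.normal(size=(2000, dim)))
est, exponent = rolling_validation(stream, dim, lr_exponents=[0.3, 0.5, 0.7])
print(f"selected learning-rate exponent: {exponent}")
```

Because each candidate's prediction is already computed as part of its SGD update, the validation score costs essentially one extra multiply-add per candidate per sample, which is why the procedure adds minimal overhead and keeps the estimators fully online.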
@article{zhang2025_2310.12140,
  title={Online Estimation with Rolling Validation: Adaptive Nonparametric Estimation with Streaming Data},
  author={Tianyu Zhang and Jing Lei},
  journal={arXiv preprint arXiv:2310.12140},
  year={2025}
}