Quantifying Overfitting along the Regularization Path for Two-Part-Code MDL in Supervised Classification
We provide a complete characterization of the entire regularization curve of a modified two-part-code Minimum Description Length (MDL) learning rule for binary classification, based on an arbitrary prior or description language. Grünwald and Langford [2004] previously established the lack of asymptotic consistency, from an agnostic PAC (frequentist worst case) perspective, of the MDL rule with a penalty parameter of λ = 1, suggesting that it under-regularizes. Driven by interest in understanding how benign or catastrophic under-regularization and overfitting might be, we obtain a precise quantitative description of the worst-case limiting error as a function of the regularization parameter λ and noise level (or approximation error), significantly tightening the analysis of Grünwald and Langford for λ = 1 and extending it to all other choices of λ.
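For concreteness, a minimal sketch of the penalized rule the abstract describes; the exact normalization and the symbols π, n, and err̂_n are our assumptions, not taken from the paper. Given a prior π over a hypothesis class H and an i.i.d. sample of size n, a λ-penalized two-part-code MDL rule can be written as

\[
  \hat{h}_\lambda \in \operatorname*{arg\,min}_{h \in \mathcal{H}}
    \; \widehat{\mathrm{err}}_n(h)
    \;+\; \lambda \cdot \frac{-\log_2 \pi(h)}{n},
\]

where \(\widehat{\mathrm{err}}_n(h)\) is the empirical 0/1 error of \(h\) on the sample and \(-\log_2 \pi(h)\) is the description length of \(h\) in bits under the prior (or description language) \(\pi\). Under this form, λ = 1 recovers the standard two-part MDL rule analyzed by Grünwald and Langford [2004], and larger λ corresponds to stronger regularization.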
@article{zhu2025_2503.02110,
  title={Quantifying Overfitting along the Regularization Path for Two-Part-Code MDL in Supervised Classification},
  author={Xiaohan Zhu and Nathan Srebro},
  journal={arXiv preprint arXiv:2503.02110},
  year={2025}
}