
EarlyStopping: Implicit Regularization for Iterative Learning Procedures in Python

Abstract

Iterative learning procedures are ubiquitous in machine learning and modern statistics. Regularisation is typically required to prevent the expected loss of a procedure from inflating in later iterations via the propagation of noise inherent in the data. Significant emphasis has been placed on achieving this regularisation implicitly by stopping procedures early. The EarlyStopping package provides a toolbox of (in-sample) sequential early stopping rules for several well-known iterative estimation procedures, such as truncated SVD, Landweber (gradient descent), conjugate gradient descent, L2-boosting and regression trees. One of the central features of the package is that the algorithms allow the specification of the true data-generating process and keep track of relevant theoretical quantities. In this paper, we detail the principles governing the implementation of the EarlyStopping package and provide a survey of recent foundational advances in the theoretical literature. We demonstrate how to use the EarlyStopping package to explore core features of implicit regularisation and replicate results from the literature.
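The idea of implicit regularisation by early stopping can be illustrated with a minimal, self-contained sketch. The following is a generic implementation of Landweber iteration (gradient descent for least squares) stopped by the discrepancy principle; it does not use the EarlyStopping package's actual API, and the stopping threshold `tau` and step size are illustrative choices.

```python
# Generic sketch: Landweber iteration x_{k+1} = x_k + step * A^T (y - A x_k),
# stopped early once the residual ||y - A x_k|| falls below tau * delta,
# where delta is the (assumed known) noise level. Pure Python, no dependencies.

def matvec(A, x):
    """Matrix-vector product for a list-of-rows matrix."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def norm(v):
    return sum(vi * vi for vi in v) ** 0.5

def landweber_early_stopping(A, y, delta, tau=1.5, step=0.1, max_iter=1000):
    """Run Landweber iteration from x = 0 and stop at the first iteration
    whose residual norm is at most tau * delta (discrepancy principle)."""
    At = transpose(A)
    x = [0.0] * len(A[0])
    for k in range(max_iter):
        residual = [yi - ri for yi, ri in zip(y, matvec(A, x))]
        if norm(residual) <= tau * delta:
            return x, k  # stopped early: residual has reached the noise level
        grad = matvec(At, residual)  # gradient step towards the data
        x = [xi + step * gi for xi, gi in zip(x, grad)]
    return x, max_iter

# Toy example: y = A x_true plus a small perturbation playing the role of noise.
A = [[2.0, 0.0], [0.0, 1.0]]
x_true = [1.0, -1.0]
y = [2.05, -1.05]  # noisy observations; noise norm delta ~ 0.0707
x_hat, stopped_at = landweber_early_stopping(A, y, delta=0.0707)
```

Iterating past the stopping index would fit the noise in `y` ever more closely, which is precisely the loss inflation that early stopping is designed to prevent.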

@article{ziebell2025_2503.16753,
  title={EarlyStopping: Implicit Regularization for Iterative Learning Procedures in Python},
  author={Eric Ziebell and Ratmir Miftachov and Bernhard Stankewitz and Laura Hucker},
  journal={arXiv preprint arXiv:2503.16753},
  year={2025}
}