Univariate Mean Change Point Detection: Penalization, CUSUM and Optimality

Abstract

The problem of univariate mean change point detection and localization based on a sequence of $n$ independent observations with piecewise constant means has been intensively studied for more than half a century, and serves as a blueprint for change point problems in more complex settings. We provide a complete characterization of this classical problem in a general framework in which the upper bound $\sigma^2$ on the noise variance, the minimal spacing $\Delta$ between two consecutive change points and the minimal magnitude $\kappa$ of the changes are allowed to vary with $n$. We first show that consistent localization of the change points is impossible when the signal-to-noise ratio satisfies $\frac{\kappa \sqrt{\Delta}}{\sigma} < \sqrt{\log(n)}$. In contrast, when $\frac{\kappa \sqrt{\Delta}}{\sigma}$ diverges with $n$ at a rate of at least $\sqrt{\log(n)}$, we demonstrate that two computationally efficient change point estimators, one based on the solution to an $\ell_0$-penalized least squares problem and the other on the popular wild binary segmentation algorithm, are both consistent and achieve a localization rate of order $\frac{\sigma^2}{\kappa^2} \log(n)$. We further show that this rate is minimax optimal, up to a $\log(n)$ factor.
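To illustrate the CUSUM machinery underlying the estimators discussed in the abstract, here is a minimal sketch of plain binary segmentation driven by the CUSUM statistic. It is a simplification: the paper's wild binary segmentation additionally maximizes the CUSUM statistic over randomly drawn subintervals, and the threshold choice below (a constant times $\sqrt{\log(n)}$, matching the signal-to-noise scaling in the abstract) is illustrative, not the paper's tuned constant. All function names and the simulated data are assumptions for this sketch.

```python
import numpy as np

def cusum(x, s, e):
    """CUSUM statistics on the interval [s, e): for each split point t
    (s < t < e), the standardized gap between the means of x[s:t] and x[t:e]."""
    n = e - s
    t = np.arange(s + 1, e)          # candidate split points
    c = np.cumsum(x[s:e])
    left, total = c[:-1], c[-1]      # partial sums of x[s:t] and sum of x[s:e]
    nl, nr = t - s, e - t            # left / right segment lengths
    stat = np.sqrt(nl * nr / n) * np.abs(left / nl - (total - left) / nr)
    return t, stat

def binary_segmentation(x, s, e, tau, out):
    """Recursively split [s, e) at the CUSUM maximizer while it exceeds tau."""
    if e - s < 2:
        return
    t, stat = cusum(x, s, e)
    i = np.argmax(stat)
    if stat[i] > tau:
        b = t[i]
        out.append(b)
        binary_segmentation(x, s, b, tau, out)
        binary_segmentation(x, b, e, tau, out)

# Toy example: one mean change of magnitude kappa = 3 at position 100.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
cps = []
binary_segmentation(x, 0, len(x), tau=3 * np.sqrt(np.log(len(x))), out=cps)
print(sorted(cps))  # expected: a single estimate near 100
```

With this signal-to-noise ratio ($\kappa\sqrt{\Delta}/\sigma = 30 \gg \sqrt{\log n}$), the CUSUM maximizer lands within a small neighborhood of the true change point, in line with the $\frac{\sigma^2}{\kappa^2}\log(n)$ localization rate.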
