Consistencies and rates of convergence of jump-penalized least squares estimators

We study the asymptotics for jump-penalized least squares regression aiming at approximating a regression function by piecewise constant functions. Besides conventional consistency and convergence rates of the estimates in L^2([0,1)), our results cover other metrics like the Skorokhod metric on the space of càdlàg functions and uniform metrics on C([0,1]). We will show that these estimators are, in an adaptive sense, rate optimal over certain classes of "approximation spaces." Special cases are the class of functions of bounded variation, (piecewise) Hölder continuous functions of order 0 < α ≤ 1, and the class of step functions with a finite but arbitrary number of jumps. In the latter setting, we will also deduce the rates known from change-point analysis for detecting the jumps. Finally, the issue of fully automatic selection of the smoothing parameter is addressed.
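The estimator referred to here minimizes, over all piecewise constant candidates, the sum of squared residuals plus a penalty γ times the number of jumps (a Potts-type functional). As a concrete illustration only, and not the paper's own code, the following is a minimal sketch of the classical O(n²) dynamic program that computes such a minimizer for an equidistant design; the function name `potts_segmentation` and the fixed choice of γ are illustrative assumptions.

```python
import numpy as np

def potts_segmentation(y, gamma):
    """Minimize sum_i (y_i - f_i)^2 + gamma * (#jumps of f) over step functions f.

    Illustrative O(n^2) dynamic program; returns the fitted values.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Cumulative sums allow O(1) evaluation of segment costs.
    s = np.concatenate(([0.0], np.cumsum(y)))
    s2 = np.concatenate(([0.0], np.cumsum(y * y)))

    def seg_cost(i, j):
        # Squared-error cost of fitting y[i:j] (half-open) by its mean.
        m = j - i
        return s2[j] - s2[i] - (s[j] - s[i]) ** 2 / m

    B = np.full(n + 1, np.inf)          # B[j]: optimal penalized cost for y[:j]
    B[0] = -gamma                       # offset: #jumps = #segments - 1
    last = np.zeros(n + 1, dtype=int)   # start index of the last segment
    for j in range(1, n + 1):
        for i in range(j):
            c = B[i] + gamma + seg_cost(i, j)
            if c < B[j]:
                B[j], last[j] = c, i

    # Backtrack the optimal segment boundaries and fill in segment means.
    f = np.empty(n)
    j = n
    while j > 0:
        i = last[j]
        f[i:j] = (s[j] - s[i]) / (j - i)
        j = i
    return f

# Example: recover a noisy step function; gamma chosen ad hoc here,
# whereas its fully automatic selection is what the paper addresses.
rng = np.random.default_rng(0)
truth = np.repeat([0.0, 2.0, -1.0], 50)
fit = potts_segmentation(truth + 0.3 * rng.standard_normal(150), gamma=1.0)
```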