Consistencies and rates of convergence of jump-penalized least squares estimators

Abstract

We study the asymptotics of jump-penalized least squares regression, which aims at approximating a regression function by piecewise constant functions. Besides conventional consistency and convergence rates of the estimates in $L^2([0,1))$, our results cover other metrics, such as the Skorokhod metric on the space of càdlàg functions and uniform metrics on $C([0,1])$. We show that these estimators are rate optimal, in an adaptive sense, over certain classes of "approximation spaces." Special cases are the class of functions of bounded variation, (piecewise) Hölder continuous functions of order $0<\alpha\le 1$, and the class of step functions with a finite but arbitrary number of jumps. In the latter setting we also deduce the rates known from change-point analysis for detecting the jumps. Finally, the issue of fully automatic selection of the smoothing parameter is addressed.
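For orientation, a typical form of such a jump-penalized least squares criterion is sketched below; the notation, including the penalty parameter $\gamma$ and the jump-count functional $\#J(f)$, is a generic illustration rather than a quotation of the paper's definitions.

\[
\hat f_\gamma \in \operatorname*{arg\,min}_{f \ \text{piecewise constant}} \; \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - f(x_i)\bigr)^2 \;+\; \gamma\,\#J(f),
\]

where $y_i$ are noisy observations of the regression function at design points $x_i \in [0,1)$, $\#J(f)$ counts the jumps of $f$, and $\gamma>0$ is the smoothing parameter whose data-driven choice the abstract refers to. Minimizers of such Potts-type functionals can be computed exactly by dynamic programming, which makes the criterion practical despite the combinatorial nature of the jump penalty.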
