
Statistical inference for the slope parameter in functional linear regression

Abstract

In this paper we consider the linear regression model $Y = SX + \varepsilon$ with functional regressors and responses. We develop new inference tools to quantify deviations of the true slope $S$ from a hypothesized operator $S_0$ with respect to the squared Hilbert--Schmidt norm $\|S - S_0\|^2$, as well as the prediction error $\mathbb{E}\|SX - S_0X\|^2$. Our analysis is applicable to functional time series and is based on asymptotically pivotal statistics. This makes it particularly user-friendly, because it avoids the choice of tuning parameters inherent in long-run variance estimation or the bootstrap for dependent data. We also discuss two-sample problems as well as change point detection. Finite sample properties are investigated by means of a simulation study.

Mathematically, our approach is based on a sequential version of the popular spectral cut-off estimator $\hat S_N$ of $S$. It is well known that the $L^2$-minimax rates in the functional regression model, both in estimation and prediction, are substantially slower than $1/\sqrt{N}$ (where $N$ denotes the sample size) and that standard estimators for $S$ do not converge weakly to non-degenerate limits. However, we demonstrate that simple plug-in estimators, such as $\|\hat S_N - S_0\|^2$ for $\|S - S_0\|^2$, are $\sqrt{N}$-consistent and that their sequential versions satisfy weak invariance principles. These results are based on the smoothing effect of $L^2$-norms and are established by a new proof technique, the {\it smoothness shift}, which has potential applications in other statistical inverse problems.
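To make the objects in the abstract concrete, the following is a minimal numerical sketch (not code from the paper) of a spectral cut-off estimator $\hat S_N$ and the plug-in statistic $\|\hat S_N - S_0\|^2$ for the case $S_0 = 0$. The grid discretization, the choice of slope kernel, the regressor process, and the truncation level `K` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0, 1, 100)        # discretization grid for [0, 1]
N, p = 500, len(grid)
dt = grid[1] - grid[0]

# Hypothetical slope operator S given by the integral kernel s(t, u)
kernel = np.outer(np.sin(np.pi * grid), np.cos(np.pi * grid))

# Functional regressors: smooth Gaussian processes built from sine modes
X = np.zeros((N, p))
for j in range(1, 11):
    coef = rng.normal(0, 1.0 / j, size=N)              # decaying scores
    X += np.outer(coef, np.sqrt(2) * np.sin(j * np.pi * grid))

# Functional responses: Y_i(t) = \int s(t, u) X_i(u) du + noise
Y = X @ kernel.T * dt + 0.1 * rng.normal(size=(N, p))

# Spectral cut-off estimator: invert the empirical covariance of X only on
# its leading K eigendirections (K is the tuning parameter of the estimator)
K = 5
cov_X = X.T @ X / N * dt                               # covariance operator of X
eigval, eigvec = np.linalg.eigh(cov_X)
eigval = eigval[::-1][:K]
eigvec = eigvec[:, ::-1][:, :K] / np.sqrt(dt)          # L2-normalized eigenfunctions
scores = X @ eigvec * dt                               # FPCA scores <X_i, v_k>

# Kernel of \hat S_N: sum_k (1/lambda_k) (N^{-1} sum_i Y_i <X_i, v_k>) v_k(u)
S_hat = (Y.T @ scores / N / eigval) @ eigvec.T

# Plug-in statistic for the squared Hilbert--Schmidt norm, with S_0 = 0
hs_norm_sq = np.sum(S_hat**2) * dt**2
print(f"plug-in ||S_hat - S_0||^2 = {hs_norm_sq:.3f}")
print(f"true    ||S - S_0||^2     = {np.sum(kernel**2) * dt**2:.3f}")
```

The plug-in estimate is deliberately crude: the truncation at `K` components biases it downward, yet, as the abstract indicates, the squared norm itself can be estimated at the parametric $\sqrt{N}$ rate even though $\hat S_N$ converges to $S$ much more slowly.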
