
Non-Asymptotic Bounds for the \ell_{\infty} Estimator in Linear Regression with Uniform Noise

Abstract

The Chebyshev or \ell_{\infty} estimator is an unconventional alternative to ordinary least squares for solving linear regression. It is defined as the minimizer of the \ell_{\infty} objective function \begin{align*} \hat{\boldsymbol{\beta}} := \arg\min_{\boldsymbol{\beta}} \|\boldsymbol{Y} - \mathbf{X}\boldsymbol{\beta}\|_{\infty}. \end{align*} The asymptotic distribution of the Chebyshev estimator with a fixed number of covariates was recently studied (Knight, 2020), yet finite-sample guarantees and generalizations to high-dimensional settings remain open. In this paper, we develop non-asymptotic upper bounds on the estimation error \|\hat{\boldsymbol{\beta}}-\boldsymbol{\beta}^*\|_2 of the Chebyshev estimator \hat{\boldsymbol{\beta}}, in a regression setting with uniformly distributed noise \varepsilon_i\sim U([-a,a]) where a is either known or unknown. Under relatively mild assumptions on the (random) design matrix \mathbf{X}, we can bound the error rate by \frac{C_p}{n} with high probability, for some constant C_p depending on the dimension p and the law of the design. Furthermore, we illustrate that there exist designs for which the Chebyshev estimator is (nearly) minimax optimal. On the other hand, we also argue that there exist designs for which this estimator behaves sub-optimally in terms of the constant C_p's dependence on p. In addition, we show that "Chebyshev's LASSO" has advantages over the regular LASSO in high-dimensional settings, provided that the noise is uniform. Specifically, we argue that it achieves a much faster rate of estimation under certain assumptions on the growth rate of the sparsity level and the ambient dimension with respect to the sample size.
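As a minimal illustrative sketch (not part of the paper), the \ell_{\infty} minimization above can be rewritten as a linear program in the variables (\boldsymbol{\beta}, t): minimize t subject to |Y_i - \mathbf{x}_i^\top\boldsymbol{\beta}| \le t for all i. The Python code below, using scipy.optimize.linprog, is one way to compute such an estimator; the function name chebyshev_estimator and the simulated design are illustrative choices, not taken from the paper.

# Sketch: Chebyshev / l_inf estimator via linear programming (illustrative, not from the paper).
import numpy as np
from scipy.optimize import linprog

def chebyshev_estimator(X, y):
    """Solve argmin_beta ||y - X beta||_inf as an LP in (beta, t)."""
    n, p = X.shape
    c = np.zeros(p + 1)
    c[-1] = 1.0                                   # objective: minimize t
    ones = np.ones((n, 1))
    # The constraints X beta - t <= y and -X beta - t <= -y encode |y - X beta| <= t.
    A_ub = np.vstack([np.hstack([X, -ones]),
                      np.hstack([-X, -ones])])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * p + [(0, None)]     # beta free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]

# Simulated example with uniform noise eps_i ~ U([-a, a]), matching the paper's noise model.
rng = np.random.default_rng(0)
n, p, a = 500, 5, 1.0
X = rng.normal(size=(n, p))
beta_star = rng.normal(size=p)
y = X @ beta_star + rng.uniform(-a, a, size=n)
beta_hat = chebyshev_estimator(X, y)
print(np.linalg.norm(beta_hat - beta_star))       # l_2 estimation error

The LP reformulation is standard for \ell_{\infty} objectives and scales to moderate n and p with any off-the-shelf LP solver.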
