
Convex Regression in Multidimensions: Suboptimality of Least Squares Estimators

Abstract

The least squares estimator (LSE) is shown to be suboptimal in squared error loss in the usual nonparametric regression model with Gaussian errors for $d \geq 5$ for each of the following families of functions: (i) convex functions supported on a polytope (in fixed design), (ii) bounded convex functions supported on a polytope (in random design), and (iii) convex Lipschitz functions supported on any convex domain (in random design). For each of these families, the risk of the LSE is proved to be of order $n^{-2/d}$ (up to logarithmic factors), while the minimax risk is $n^{-4/(d+4)}$, for $d \geq 5$. In addition, the first rate of convergence results (worst case and adaptive) for the full convex LSE are established for polytopal domains for all $d \geq 1$. Some new metric entropy results for convex functions are also proved, which are of independent interest.
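As a minimal illustration (not taken from the paper), the convex LSE over a fixed design can be computed as a finite-dimensional quadratic program with subgradient constraints, a standard characterization of the estimator. The sketch below uses cvxpy on synthetic data; the sample size, dimension, noise level, and true function are illustrative assumptions only.

```python
# Minimal sketch: convex least squares estimator (LSE) on a fixed design,
# solved as a quadratic program with subgradient constraints. Synthetic data;
# all settings here are assumptions for illustration, not from the paper.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, d = 50, 2                                   # sample size and dimension (illustrative)
X = rng.uniform(-1.0, 1.0, size=(n, d))        # design points in [-1, 1]^d
f0 = np.sum(X**2, axis=1)                      # true convex function ||x||^2
y = f0 + 0.1 * rng.standard_normal(n)          # observations with Gaussian noise

f = cp.Variable(n)        # fitted values f(X_i)
g = cp.Variable((n, d))   # subgradients of the fit at each X_i

# Convexity constraints: f(X_j) >= f(X_i) + <g_i, X_j - X_i> for all i != j.
constraints = [
    f[j] >= f[i] + g[i, :] @ (X[j] - X[i])
    for i in range(n) for j in range(n) if i != j
]
problem = cp.Problem(cp.Minimize(cp.sum_squares(y - f)), constraints)
problem.solve()

# Empirical squared-error loss of the LSE at the design points.
print("average squared error of the convex LSE:", np.mean((f.value - f0) ** 2))
```

The quadratic program has $O(n^2)$ constraints, so this direct formulation is only practical for small samples; it is meant to make the definition of the estimator concrete, not to suggest an efficient implementation.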

