Characterizing the minimax rate of nonparametric regression under bounded convex constraints

We quantify the minimax rate for a nonparametric regression model over a convex function class $\mathcal{F}$ with bounded diameter. We obtain a minimax rate of $\varepsilon^{\ast 2}$, where \[\varepsilon^{\ast} =\sup\{\varepsilon>0:n\varepsilon^2 \le \log M_{\mathcal{F}}^{\operatorname{loc}}(\varepsilon,c)\},\] where $M_{\mathcal{F}}^{\operatorname{loc}}(\varepsilon,c)$ is the local metric entropy of $\mathcal{F}$ and our loss function is the squared population distance over our input space. In contrast to classical works on the topic [cf. Yang and Barron, 1999], our results do not require the functions in $\mathcal{F}$ to be uniformly bounded in sup-norm. In addition, we prove that our estimator is adaptive to the true point, and to the best of our knowledge this is the first such adaptive estimator in this general setting. This work builds on the Gaussian sequence framework of Neykov [2022], using a similar algorithmic scheme to achieve the minimax rate. Our algorithmic rate also applies when the noise is sub-Gaussian. We illustrate the utility of this theory with examples including multivariate monotone functions, linear functionals over ellipsoids, and Lipschitz classes.
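For intuition, here is a back-of-the-envelope use of the fixed-point characterization (a sketch of ours, not part of the abstract): assuming the local metric entropy of a bounded $d$-dimensional Lipschitz class scales as $\log M_{\mathcal{F}}^{\operatorname{loc}}(\varepsilon,c) \asymp \varepsilon^{-d}$, balancing the two sides of the defining inequality recovers the classical nonparametric rate, \[ n\varepsilon^2 \asymp \varepsilon^{-d} \quad\Longrightarrow\quad \varepsilon^{\ast} \asymp n^{-\frac{1}{d+2}}, \qquad \varepsilon^{\ast 2} \asymp n^{-\frac{2}{d+2}}, \] which matches the well-known minimax rate for Lipschitz regression in $d$ dimensions.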
View on arXiv