Global rates of convergence in log-concave density estimation

Abstract

The estimation of a log-concave density on $\mathbb{R}^d$ represents a central problem in the area of nonparametric inference under shape constraints. In this paper, we study the performance of log-concave density estimators with respect to global loss functions, and adopt a minimax approach. We first show that no statistical procedure based on a sample of size $n$ can estimate a log-concave density with respect to the squared Hellinger loss function with supremum risk smaller than order $n^{-4/5}$ when $d = 1$, and order $n^{-2/(d+1)}$ when $d \geq 2$. In particular, this reveals a sense in which, when $d \geq 3$, log-concave density estimation is fundamentally more challenging than the estimation of a density with two bounded derivatives (a problem to which it has been compared). Second, we show that for $d \leq 3$, the Hellinger $\epsilon$-bracketing entropy of a class of log-concave densities with small mean and covariance matrix close to the identity grows like $\max\{\epsilon^{-d/2}, \epsilon^{-(d-1)}\}$ (up to a logarithmic factor when $d = 2$). This enables us to prove that when $d \leq 3$ the log-concave maximum likelihood estimator achieves the minimax optimal rate (up to logarithmic factors when $d = 2, 3$) with respect to squared Hellinger loss.
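For reference, the two headline results can be written out in display form. This is a sketch in standard notation, not notation quoted from the paper: the symbol $\mathcal{F}_d$ for the class of log-concave densities on $\mathbb{R}^d$, the bracketing-number notation $N_{[\,]}$, and the unnormalized Hellinger convention below are assumptions of this note.
\[
h^2(f, g) = \int_{\mathbb{R}^d} \bigl( f^{1/2}(x) - g^{1/2}(x) \bigr)^2 \, dx,
\qquad
\inf_{\tilde{f}_n}\, \sup_{f \in \mathcal{F}_d} \mathbb{E}_f\, h^2(\tilde{f}_n, f) \;\gtrsim\;
\begin{cases}
n^{-4/5}, & d = 1,\\[2pt]
n^{-2/(d+1)}, & d \geq 2,
\end{cases}
\]
where the infimum runs over all estimators $\tilde{f}_n$ based on a sample of size $n$. The entropy result then asserts that, for the subclass $\tilde{\mathcal{F}}_d \subseteq \mathcal{F}_d$ of densities with small mean and covariance matrix close to the identity,
\[
\log N_{[\,]}\bigl( \epsilon, \tilde{\mathcal{F}}_d, h \bigr) \;\asymp\; \max\bigl\{ \epsilon^{-d/2},\, \epsilon^{-(d-1)} \bigr\}, \qquad d \leq 3,
\]
up to a logarithmic factor when $d = 2$; standard empirical-process arguments convert entropy bounds of this kind into rates of convergence for the maximum likelihood estimator.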
