In this paper, we study two problems: (1) estimation of a $d$-dimensional log-concave distribution and (2) bounded multivariate convex regression with random design, where the design distribution is either log-concave or compactly supported with a continuous density. First, we show that for all $d \ge 4$ the maximum likelihood estimators of both problems achieve the optimal risk (up to a logarithmic factor) in terms of squared Hellinger distance and squared $L_2$ distance, respectively. Previously, the optimality of both these estimators was known only for $d \le 3$. We also prove that the $\varepsilon$-entropy numbers of the two aforementioned families are equal up to logarithmic factors. We complement these results by proving a sharp bound on the minimax rate (up to logarithmic factors) with respect to the total variation distance. Finally, we prove that estimating a log-concave density - even a uniform distribution on a convex set - up to a fixed accuracy requires a number of samples \emph{at least} exponential in the dimension. We do so by improving the dimensional constant in the best known lower bound on the minimax rate.
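For reference, the central objects of the abstract can be written out as follows. These are the standard textbook definitions (the notation $\mathcal{F}_d$ for the log-concave class and $\hat{f}_n$ for an estimator is ours, not taken verbatim from the paper):

```latex
% A density f on R^d is log-concave if it is the exponential of a concave function:
f(x) = e^{-\varphi(x)}, \qquad \varphi : \mathbb{R}^d \to (-\infty, \infty] \ \text{convex}.

% Squared Hellinger distance between densities f and g on R^d:
h^2(f, g) = \frac{1}{2} \int_{\mathbb{R}^d} \bigl( \sqrt{f(x)} - \sqrt{g(x)} \bigr)^2 \, dx.

% Minimax risk for estimating a member of the log-concave class F_d
% from n i.i.d. samples, measured in squared Hellinger distance:
\mathcal{R}_n(\mathcal{F}_d) = \inf_{\hat{f}_n} \, \sup_{f \in \mathcal{F}_d} \, \mathbb{E}_{f}\, h^2\bigl(\hat{f}_n, f\bigr),
```

where the infimum runs over all estimators $\hat{f}_n$ built from the $n$ samples. The paper's first result says the maximum likelihood estimator attains this minimax risk, up to a logarithmic factor, in every dimension $d \ge 4$.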