Approximation by log-concave distributions, with applications to regression

Abstract

We study the approximation of arbitrary distributions P on d-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback–Leibler-type functional. We show that such an approximation exists if and only if P has finite first moments and is not supported by some hyperplane. Furthermore, we show that this approximation depends continuously on P with respect to the Mallows distance D_1(·,·). This result implies consistency of the maximum likelihood estimator of a log-concave density under fairly general conditions. It also allows us to prove existence and consistency of estimators in regression models with a response Y = μ(X) + ε, where X and ε are independent, μ(·) belongs to a certain class of regression functions, and ε is a random error with log-concave density and mean zero.
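The criterion described above can be sketched as follows; the notation (f*, the class 𝓕) is ours for illustration and is not taken verbatim from the paper.

```latex
% Sketch of the Kullback--Leibler-type criterion (notation assumed):
% the log-concave approximation f^* maximizes the log-likelihood functional
\[
  f^* \;=\; \operatorname*{arg\,max}_{f \in \mathcal{F}} \int \log f \, dP,
\]
% where \mathcal{F} is the class of log-concave probability densities on
% \mathbb{R}^d. Taking P to be the empirical measure \hat{P}_n recovers the
% log-concave maximum likelihood estimator. Per the abstract, a maximizer
% exists if and only if
\[
  \int \lVert x \rVert \, dP(x) < \infty
  \quad\text{and}\quad
  P(H) < 1 \ \text{for every hyperplane } H \subset \mathbb{R}^d .
\]
```

The continuity of f* in P with respect to the Mallows distance D_1 is what yields the consistency results stated in the abstract.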
