
Information Lower Bounds for Robust Mean Estimation

Abstract

We prove lower bounds on the error of any estimator for the mean of a real probability distribution under the knowledge that the distribution belongs to a given set. We apply these lower bounds to both parametric and nonparametric estimation. In the nonparametric case, we apply our results to the question of sub-Gaussian estimation for distributions with finite variance to obtain new lower bounds in the small error probability regime, and present an optimal estimator in that regime. In the (semi-)parametric case, we use the Fisher information to provide distribution-dependent lower bounds that are asymptotically tight up to constants, of order $\sqrt{2\log(1/\delta)/(nI)}$, where $I$ is the Fisher information of the distribution. We use known minimizers of the Fisher information on some nonparametric sets of distributions to give lower bounds in cases such as corrupted distributions, or bounded/semi-bounded distributions.
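As a rough numerical illustration of the rate stated above (not the paper's own code; the function name and parameters are ours), the distribution-dependent lower bound $\sqrt{2\log(1/\delta)/(nI)}$ can be evaluated directly. For a Gaussian with variance $\sigma^2$, the Fisher information for the location parameter is $I = 1/\sigma^2$, so the bound reduces to the familiar $\sigma\sqrt{2\log(1/\delta)/n}$ sub-Gaussian deviation term:

```python
import math

def fisher_lower_bound(n, delta, fisher_info):
    """Evaluate the lower bound sqrt(2 log(1/delta) / (n * I)).

    n           : sample size
    delta       : target error probability
    fisher_info : Fisher information I of the distribution
    """
    return math.sqrt(2 * math.log(1 / delta) / (n * fisher_info))

# Illustrative values (not from the paper): a Gaussian with sigma = 2
# has Fisher information I = 1 / sigma^2 = 0.25 for its mean.
sigma = 2.0
n, delta = 1000, 0.01
bound = fisher_lower_bound(n, delta, 1 / sigma**2)
# Matches sigma * sqrt(2 log(1/delta) / n), the sub-Gaussian rate.
```

This only illustrates the order of the bound; the paper's contribution is proving that no estimator can beat it asymptotically.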
