
On Bayes Risk Lower Bounds

Abstract

This paper provides a general technique for lower bounding the Bayes risk for arbitrary loss functions and prior distributions in the standard abstract decision-theoretic setting. A lower bound on the Bayes risk not only serves as a lower bound on the minimax risk but also characterizes the fundamental statistical difficulty of a decision problem under a given prior. Our bounds are based on the notion of f-informativity of the underlying class of probability measures and the prior. Applying our bounds requires upper bounds on the f-informativity, and we derive new upper bounds on f-informativity for a class of f functions that lead to tight Bayes risk lower bounds. Our technique yields generalizations of a variety of classical minimax bounds (e.g., a generalized Fano's inequality). Using our Bayes risk lower bound, we give a succinct proof of the main result of Chatterjee [2014]: for estimating the mean of a Gaussian random vector under a convex constraint, the least squares estimator is always admissible up to a constant.
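For context (this sketch is a standard formulation, not quoted from the paper): the f-informativity referred to above is Csiszár's (1972) notion, built from f-divergences, and can be written as

% f-divergence between probability measures P and Q, for a convex
% function f with f(1) = 0:
\[
  D_f(P \,\|\, Q) \;=\; \int f\!\left(\frac{dP}{dQ}\right) dQ .
\]
% f-informativity of a family {P_theta : theta in Theta} under a prior w;
% the infimum runs over all probability measures Q on the sample space:
\[
  I_f(w, \{P_\theta\}) \;=\; \inf_{Q} \int_\Theta D_f(P_\theta \,\|\, Q)\, w(d\theta).
\]

Taking f(x) = x log x makes D_f the Kullback-Leibler divergence, and the infimum is then attained at the prior-induced marginal, so I_f recovers the mutual information appearing in the classical Fano inequality; Fano-type bounds thus correspond to this particular choice of f in the general technique.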
