
On Bayes Risk Lower Bounds

Abstract

This paper provides a general technique for lower bounding the Bayes risk for arbitrary loss functions and prior distributions in the standard abstract decision-theoretic setting. A lower bound on the Bayes risk not only serves as a lower bound on the minimax risk but also characterizes the fundamental statistical difficulty of a decision problem under a given prior. Our bounds are based on the notion of f-informativity of the underlying class of probability measures and the prior. Applying our bounds requires upper bounds on the f-informativity, and we derive new upper bounds on f-informativity for a class of f functions that lead to tight Bayes risk lower bounds. Our technique generalizes a variety of classical minimax bounds. As applications, we present Bayes risk lower bounds for several concrete estimation problems, including Gaussian location models, the Bayesian Lasso, generalized linear models, and principal component analysis for spiked covariance models.
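For readers unfamiliar with the central quantity, a sketch of the standard definition of f-informativity (following Csiszár's formulation in terms of f-divergences; the notation below is illustrative and not taken from this abstract): for a prior π on a parameter space Θ and a family of distributions {P_θ}, the f-informativity is the smallest average f-divergence from the family to any single reference measure Q,

```latex
I_f(\pi, \{P_\theta\}) \;=\; \inf_{Q} \int_{\Theta} D_f\!\left(P_\theta \,\|\, Q\right)\, d\pi(\theta),
\qquad
D_f(P \,\|\, Q) = \int f\!\left(\frac{dP}{dQ}\right) dQ,
```

where f is a convex function with f(1) = 0. Taking f(x) = x log x recovers the mutual information between the parameter (drawn from π) and the observation, which is the special case underlying classical Fano-type bounds.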
