
A constrained risk inequality for general losses

Abstract

We provide a general constrained risk inequality that applies to arbitrary non-decreasing losses, extending a result of Brown and Low [Ann. Stat. 1996]. Given two distributions $P_0$ and $P_1$, we find a lower bound for the risk of estimating a parameter $\theta(P_1)$ under $P_1$ given an upper bound on the risk of estimating the parameter $\theta(P_0)$ under $P_0$. The inequality is a useful pedagogical tool, as its proof relies only on the Cauchy-Schwarz inequality, it applies to general losses, and it transparently gives risk lower bounds on super-efficient and adaptive estimators.
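For context, the quadratic-loss special case due to Brown and Low, which this paper generalizes, can be sketched as follows. The notation here ($\hat\theta$, $\Delta$, $\varepsilon$, $I$) is illustrative and not taken from the paper:

```latex
% Sketch of the Brown--Low constrained risk inequality (quadratic loss).
% Write \theta_i = \theta(P_i), \Delta = |\theta_1 - \theta_0|, and let
% I^2 = \mathbb{E}_{P_0}\!\left[(dP_1/dP_0)^2\right] denote a chi-square-type
% affinity between the two distributions.
%
% If an estimator \hat\theta is accurate under P_0,
\mathbb{E}_{P_0}\!\left[(\hat\theta - \theta_0)^2\right] \le \varepsilon^2,
% then, provided \varepsilon I \le \Delta, its risk under P_1 satisfies
\mathbb{E}_{P_1}\!\left[(\hat\theta - \theta_1)^2\right]
  \ge \left(\Delta - \varepsilon I\right)^2.
```

The present paper's contribution is to replace the squared error above with an arbitrary non-decreasing loss while retaining a Cauchy-Schwarz-based proof.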
