Arimoto-Rényi Conditional Entropy and Bayesian M-ary Hypothesis Testing

Abstract

This paper gives upper and lower bounds on the minimum error probability of Bayesian M-ary hypothesis testing in terms of the Arimoto-Rényi conditional entropy of an arbitrary order α. The improved tightness of these bounds over their specialized versions with the Shannon conditional entropy (α = 1) is demonstrated. In particular, in the case where M is finite, we show how to generalize Fano's inequality under both the conventional and list-decision settings. As a counterpart to the generalized Fano's inequality, allowing M to be infinite, a lower bound on the Arimoto-Rényi conditional entropy is derived as a function of the minimum error probability. Explicit upper and lower bounds on the minimum error probability are obtained as a function of the Arimoto-Rényi conditional entropy for both positive and negative α. Furthermore, we give upper bounds on the minimum error probability as functions of the Rényi divergence. In the setup of discrete memoryless channels, we analyze the exponentially vanishing decay of the Arimoto-Rényi conditional entropy of the transmitted codeword given the channel output when averaged over a random coding ensemble.
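
For context (the notation below is assumed here, not quoted from the paper), the quantity the bounds are stated in terms of is Arimoto's conditional entropy of order α. A standard form, for a discrete hypothesis X observed through Y and α > 0, α ≠ 1, is

H_{\alpha}(X \mid Y) \;=\; \frac{\alpha}{1-\alpha}\,\log\, \mathbb{E}_{Y}\!\left[ \left( \sum_{x} P_{X \mid Y}(x \mid Y)^{\alpha} \right)^{1/\alpha} \right],

which recovers the Shannon conditional entropy H(X | Y) in the limit α → 1. The classical Fano inequality that the paper generalizes corresponds to that limiting case: with ε the minimum error probability of guessing X from Y among M hypotheses and h_b the binary entropy function,

H(X \mid Y) \;\le\; h_b(\varepsilon) + \varepsilon \log (M - 1).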
