A Test for Independence Via Bayesian Nonparametric Estimation of Mutual Information

Abstract

Mutual information is a well-known tool for measuring the mutual dependence between variables. In this paper, a Bayesian nonparametric estimator of mutual information is established by means of the Dirichlet process and the k-nearest neighbor distance. As a direct outcome of the estimation, an easy-to-implement test of independence is introduced through the relative belief ratio. Several theoretical properties of the approach are presented. The procedure is investigated through various examples, where it is compared with its frequentist counterpart and demonstrates good performance.
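The abstract does not give implementation details, so the following is only a minimal sketch of the general idea it describes: a k-nearest-neighbor mutual information estimate combined with posterior resampling. It assumes a Kraskov-style kNN estimator (via scikit-learn's mutual_info_regression) and a Dirichlet-weighted Bayesian bootstrap as a stand-in for Dirichlet process posterior sampling; the relative belief ratio test itself is omitted, and the function name posterior_mi_samples is hypothetical rather than from the paper.

```python
# Illustrative sketch, not the authors' exact construction.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)

def posterior_mi_samples(x, y, n_draws=200, k=3):
    """Draw MI estimates from Dirichlet-weighted resamples of (x, y)."""
    n = len(x)
    draws = []
    for _ in range(n_draws):
        # Bayesian bootstrap: Dirichlet(1, ..., 1) weights over the observed
        # points, used here as a finite-sample proxy for sampling from a
        # Dirichlet process posterior centered at the empirical distribution.
        w = rng.dirichlet(np.ones(n))
        idx = rng.choice(n, size=n, replace=True, p=w)
        # k-nearest-neighbor (Kraskov-style) mutual information estimate
        # on the resampled data.
        mi = mutual_info_regression(
            x[idx].reshape(-1, 1), y[idx], n_neighbors=k, random_state=0
        )[0]
        draws.append(mi)
    return np.array(draws)

# Example: dependent data should yield MI draws concentrated away from zero,
# which is what an independence test based on such draws would exploit.
x = rng.normal(size=500)
y = x + 0.5 * rng.normal(size=500)
draws = posterior_mi_samples(x, y)
print("posterior mean MI:", draws.mean())
```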
