Minimax Optimal Estimation of KL Divergence for Continuous Distributions

IEEE Transactions on Information Theory, 2020
Abstract

Estimating Kullback-Leibler (KL) divergence from independent and identically distributed (i.i.d.) samples is an important problem in various domains. One simple and effective estimator is based on the k-nearest-neighbor (kNN) distances between these samples. In this paper, we analyze the convergence rates of the bias and variance of this estimator. Furthermore, we derive a lower bound on the minimax mean squared error and show that the kNN method is asymptotically rate-optimal.
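The abstract does not spell out the estimator, so the following is a minimal sketch of the kNN-based KL divergence estimator it refers to, in the style of Wang, Kulkarni, and Verdú: given x_1, ..., x_n ~ p and y_1, ..., y_m ~ q in R^d, it computes D_hat = (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)), where rho_k(i) is the k-th nearest-neighbor distance from x_i within the x sample and nu_k(i) is the k-th nearest-neighbor distance from x_i to the y sample. The function name knn_kl_divergence and its parameters are illustrative, not taken from the paper.

import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    # x: (n, d) samples from p; y: (m, d) samples from q.
    # Assumes continuous distributions, so tied points (rho = 0)
    # occur with probability zero.
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # rho_i: distance from x_i to its k-th nearest neighbor among the
    # other x's (query k+1 points because the closest hit is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

    # nu_i: distance from x_i to its k-th nearest neighbor among the y's.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    # Plug-in estimate of D(p || q).
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

For example, with x drawn from N(0, 1) and y from N(1, 1), the true divergence is 0.5, and the estimate should be close for moderate sample sizes:

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(5000, 1))   # p = N(0, 1)
y = rng.normal(1.0, 1.0, size=(5000, 1))   # q = N(1, 1)
print(knn_kl_divergence(x, y, k=5))        # true D(p || q) = 0.5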
