The Nearest Neighbor Information Estimator is Adaptively Near Minimax Rate-Optimal

Abstract
We analyze the Kozachenko--Leonenko (KL) nearest neighbor estimator of differential entropy. We obtain the first uniform upper bound on its performance over H\"older balls on a torus without assuming any condition on how close the density can be to zero. Together with a new minimax lower bound over the H\"older ball, this shows that the KL estimator achieves the minimax rates up to logarithmic factors, without knowledge of the smoothness parameter of the H\"older ball, for arbitrary dimension, making it the first estimator that provably satisfies this property.
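For concreteness, here is a minimal sketch of the classical Kozachenko--Leonenko estimator in one dimension with k = 1 (this is an illustration of the standard textbook form of the estimator, not the paper's own implementation; the function name and test distribution are chosen for illustration):

```python
import math
import random

def kl_entropy_1d(samples):
    """Kozachenko--Leonenko (k = 1) differential entropy estimate in 1D.

    H_hat = (1/n) * sum_i log(eps_i) + log(V_1) + log(n - 1) + gamma,
    where eps_i is the distance from sample i to its nearest neighbor,
    V_1 = 2 is the volume of the unit ball in 1D, and gamma is the
    Euler--Mascheroni constant.
    """
    xs = sorted(samples)
    n = len(xs)
    gamma = 0.5772156649015329  # Euler--Mascheroni constant
    log_eps_sum = 0.0
    for i in range(n):
        # After sorting, the nearest neighbor of xs[i] is an adjacent point.
        left = xs[i] - xs[i - 1] if i > 0 else float("inf")
        right = xs[i + 1] - xs[i] if i < n - 1 else float("inf")
        log_eps_sum += math.log(min(left, right))
    return log_eps_sum / n + math.log(2.0) + math.log(n - 1) + gamma

# Usage: the uniform density on [0, 1] has differential entropy 0,
# so the estimate should be close to 0 for moderate sample sizes.
random.seed(0)
est = kl_entropy_1d([random.random() for _ in range(2000)])
```

The sorted-order trick makes the nearest-neighbor search O(n log n) in 1D; in higher dimensions one would use a k-d tree and the volume of the d-dimensional unit ball in place of V_1 = 2.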