A class of estimators of the Rényi and Tsallis entropies of an unknown distribution $f$ in $\mathbb{R}^m$ is presented. These estimators are based on the $k$th nearest-neighbor distances computed from a sample of $N$ i.i.d. vectors with distribution $f$. We show that entropies of any order $q$, including Shannon's entropy, can be estimated consistently with minimal assumptions on $f$. Moreover, we show that it is straightforward to extend the nearest-neighbor method to estimate the statistical distance between two distributions using one i.i.d. sample from each. (With Correction.)
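As a rough illustration of the nearest-neighbor construction the abstract describes, the sketch below estimates the Rényi entropy of order $q$ (with the Shannon case as the $q \to 1$ limit) from $k$th nearest-neighbor distances, following the standard Kozachenko–Leonenko-style form. This is a minimal sketch, not the authors' code; the function name `renyi_entropy_knn` and all variable names are illustrative assumptions.

```python
# Minimal sketch of a k-NN Rényi/Shannon entropy estimator
# (illustrative only; not the paper's reference implementation).
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma, digamma

def renyi_entropy_knn(x, q, k=1):
    """Estimate the Rényi entropy H_q of the density generating the
    sample x (shape (N, m)) from k-th nearest-neighbor distances."""
    n, m = x.shape
    tree = cKDTree(x)
    # Query k+1 neighbors: the nearest neighbor of each point is itself.
    rho = tree.query(x, k=k + 1)[0][:, k]        # k-th NN distance per point
    v_m = np.pi ** (m / 2) / gamma(m / 2 + 1)    # volume of the unit m-ball
    if q == 1:
        # Shannon limit: average log of the bias-corrected volume estimate.
        xi = (n - 1) * np.exp(-digamma(k)) * v_m * rho ** m
        return np.mean(np.log(xi))
    # Bias-correction constant for order q (requires q < k + 1).
    c_k = (gamma(k) / gamma(k + 1 - q)) ** (1 / (1 - q))
    xi = (n - 1) * c_k * v_m * rho ** m
    i_hat = np.mean(xi ** (1 - q))               # plug-in estimate of ∫ f^q
    return np.log(i_hat) / (1 - q)               # Rényi entropy of order q

# Usage sketch: estimate the entropy of a 2-D Gaussian sample.
x = np.random.default_rng(0).normal(size=(5000, 2))
print(renyi_entropy_knn(x, q=0.9, k=3))
```

The Tsallis entropy follows from the same plug-in quantity `i_hat` via $(1 - \hat{I})/(q - 1)$ instead of the logarithmic Rényi form.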