Direct Estimation of Information Divergence Using Nearest Neighbor Ratios

17 February 2017
M. Noshad
Kevin R. Moon
Salimeh Yasaei Sekeh
Alfred Hero
arXiv:1702.05222
Abstract

We propose a direct estimation method for Rényi and f-divergence measures based on a new graph theoretical interpretation. Suppose that we are given two sample sets $X$ and $Y$, respectively with $N$ and $M$ samples, where $\eta := M/N$ is a constant value. Considering the $k$-nearest neighbor ($k$-NN) graph of $Y$ in the joint data set $(X, Y)$, we show that the average powered ratio of the number of $X$ points to the number of $Y$ points among all $k$-NN points is proportional to the Rényi divergence of the $X$ and $Y$ densities. A similar method can also be used to estimate f-divergence measures. We derive bias and variance rates, and show that for the class of $\gamma$-Hölder smooth functions, the estimator achieves the MSE rate of $O(N^{-2\gamma/(\gamma+d)})$. Furthermore, by using a weighted ensemble estimation technique, for density functions with continuous and bounded derivatives of up to the order $d$, and some extra conditions at the support set boundary, we derive an ensemble estimator that achieves the parametric MSE rate of $O(1/N)$. Our estimators are more computationally tractable than other competing estimators, which makes them appealing in many practical applications.
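The construction described in the abstract can be illustrated with a short sketch: for each $Y$ point, count how many of its $k$ nearest neighbors in the joint data set come from $X$ versus $Y$, and average a power of the $\eta$-scaled ratio. The Python/NumPy snippet below is a minimal illustration of that neighbor-ratio idea, assuming a Euclidean $k$-NN search via SciPy's cKDTree; the exact ratio definition, tie handling, boundary correction, and the weighted ensemble step analysed in the paper are not reproduced here, and the function name and the floor on the $Y$-neighbor count are choices made purely for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def renyi_divergence_estimate(X, Y, alpha=0.5, k=10):
    """Sketch of a k-NN neighbor-ratio estimator of the Renyi divergence.

    X : (N, d) array of samples from the first density
    Y : (M, d) array of samples from the second density
    alpha : Renyi order in (0, 1) (illustrative default)
    k : number of nearest neighbors in the joint data set

    This follows the abstract's description only; it is not the paper's
    exact estimator (no bias correction or ensemble weighting).
    """
    N, M = len(X), len(Y)
    eta = M / N  # sample-size ratio eta := M/N from the abstract

    # Joint data set (X, Y); record which points came from X.
    Z = np.vstack([X, Y])
    is_X = np.zeros(len(Z), dtype=bool)
    is_X[:N] = True

    # k nearest neighbors of each Y point within the joint set
    # (query k+1 and drop the point itself).
    tree = cKDTree(Z)
    _, idx = tree.query(Y, k=k + 1)
    idx = idx[:, 1:]

    # N_i: X points among the k neighbors of Y_i; M_i: Y points.
    Ni = is_X[idx].sum(axis=1)
    Mi = k - Ni

    # Average powered ratio; eta * N_i / M_i is a plug-in estimate of
    # f_X(Y_i) / f_Y(Y_i).  Flooring M_i at 1 avoids division by zero
    # (an assumption made here, not taken from the paper).
    G_hat = np.mean((eta * Ni / np.maximum(Mi, 1)) ** alpha)

    # Renyi divergence of order alpha.
    return np.log(G_hat) / (alpha - 1)
```

As a usage example, calling `renyi_divergence_estimate(X, Y, alpha=0.5, k=10)` on two multivariate samples returns a single scalar; the main computational cost is the single k-NN query over the joint data set, which is what makes this family of estimators comparatively tractable.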
