Optimal rates of entropy estimation over Lipschitz balls

6 November 2017
Yanjun Han
Jiantao Jiao
Tsachy Weissman
Yihong Wu
Abstract

We consider the problem of minimax estimation of the entropy of a density over Lipschitz balls. Dropping the usual assumption that the density is bounded away from zero, we obtain the minimax rates $(n\ln n)^{-s/(s+d)} + n^{-1/2}$ for $0 < s \leq 2$ for densities supported on $[0,1]^d$, where $s$ is the smoothness parameter and $n$ is the number of independent samples. We generalize the results to densities with unbounded support: given an Orlicz function $\Psi$ of rapid growth (such as the sub-exponential and sub-Gaussian classes), the minimax rate for densities with bounded $\Psi$-Orlicz norm increases to $(n\ln n)^{-s/(s+d)} (\Psi^{-1}(n))^{d(1-d/p(s+d))} + n^{-1/2}$, where $p$ is the norm parameter in the Lipschitz ball. We also show that the integral-form plug-in estimators with kernel density estimates fail to achieve the minimax rates, and characterize their worst-case performance over the Lipschitz ball. One of the key steps in analyzing the bias relies on a novel application of the Hardy-Littlewood maximal inequality, which also leads to a new inequality on the Fisher information that may be of independent interest.
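For readers unfamiliar with the estimator class the abstract refers to, the following is a minimal sketch of an integral-form plug-in entropy estimator in one dimension: fit a kernel density estimate $\hat f$, then numerically integrate $-\int \hat f \ln \hat f$. This is only an illustration of the general construction analyzed in the paper, not the authors' implementation; the choice of `gaussian_kde` (with its default bandwidth), the grid size, and the clipping threshold are all assumptions for the sketch.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
n = 2000
# Samples from Uniform[0, 1], whose true differential entropy is 0.
samples = rng.uniform(0.0, 1.0, size=n)

# Kernel density estimate f_hat (Gaussian kernel, default bandwidth).
f_hat = gaussian_kde(samples)

# Integral-form plug-in estimate: H_hat = -∫ f_hat(x) ln f_hat(x) dx,
# approximated by a Riemann sum on a fine grid over the support.
grid = np.linspace(0.0, 1.0, 2001)
vals = np.clip(f_hat(grid), 1e-12, None)  # guard against log(0)
h_hat = -np.sum(vals * np.log(vals)) * (grid[1] - grid[0])
```

The paper's point is that, over Lipschitz balls without a lower bound on the density, estimators of this form incur too much bias in low-density regions to attain the minimax rate $(n\ln n)^{-s/(s+d)} + n^{-1/2}$, no matter how the bandwidth is tuned.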
