A note on entropy estimation

19 March 2015
Thomas Schürmann
arXiv:1503.05911
Abstract

We compare an entropy estimator $H_z$ recently discussed in [10] with two estimators $H_1$ and $H_2$ introduced in [6][7]. We prove the identity $H_z \equiv H_1$, which has not been taken into account in [10]. Then, we prove that the statistical bias of $H_1$ is less than the bias of the ordinary likelihood estimator of entropy. Finally, by numerical simulation we verify that for the most interesting regime of small sample estimation and large event spaces, the estimator $H_2$ has a significantly smaller statistical error than $H_z$.
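
The abstract refers to the ordinary likelihood (plug-in) estimator of entropy and its bias in the small-sample, large-alphabet regime. The sketch below illustrates that bias numerically; it does not implement the paper's $H_z$, $H_1$, or $H_2$ (their definitions are not given in this abstract), but instead uses the standard Miller-Madow correction as a generic example of a bias-reduced estimator. The alphabet size, sample size, and uniform source distribution are illustrative assumptions.

```python
import numpy as np

def plugin_entropy(counts):
    """Ordinary (maximum-likelihood / plug-in) entropy estimate, in nats."""
    n = counts.sum()
    p = counts[counts > 0] / n
    return -np.sum(p * np.log(p))

def miller_madow_entropy(counts):
    """Plug-in estimate with the Miller-Madow bias correction (m - 1) / (2N),
    where m is the number of bins observed at least once."""
    m = np.count_nonzero(counts)
    return plugin_entropy(counts) + (m - 1) / (2 * counts.sum())

rng = np.random.default_rng(0)
K, N, trials = 1000, 100, 2000        # large event space, small sample (assumed values)
p = np.full(K, 1.0 / K)               # uniform source; true entropy is ln K
true_H = np.log(K)

plug, mm = [], []
for _ in range(trials):
    counts = rng.multinomial(N, p)
    plug.append(plugin_entropy(counts))
    mm.append(miller_madow_entropy(counts))

print(f"true entropy         : {true_H:.4f} nats")
print(f"plug-in   mean bias  : {np.mean(plug) - true_H:+.4f}")
print(f"Miller-Madow bias    : {np.mean(mm) - true_H:+.4f}")
```

Running this shows the plug-in estimator systematically underestimating the true entropy when $N \ll K$, which is the regime in which the paper compares the statistical errors of the competing estimators.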
