Estimation of Entropy in Constant Space with Improved Sample Complexity

19 May 2022 · arXiv:2205.09804
Maryam Aliakbarpour
Andrew McGregor
Jelani Nelson
Erik Waingarten
Abstract

Recent work of Acharya et al. (NeurIPS 2019) showed how to estimate the entropy of a distribution $\mathcal{D}$ over an alphabet of size $k$ up to $\pm\epsilon$ additive error by streaming over $(k/\epsilon^3) \cdot \text{polylog}(1/\epsilon)$ i.i.d. samples and using only $O(1)$ words of memory. In this work, we give a new constant-memory scheme that reduces the sample complexity to $(k/\epsilon^2) \cdot \text{polylog}(1/\epsilon)$. We conjecture that this is optimal up to $\text{polylog}(1/\epsilon)$ factors.
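
For intuition, here is a minimal, hypothetical sketch of the generic sample-then-estimate template behind constant-memory entropy estimation, using the identity $H(\mathcal{D}) = \sum_x p_x \ln(1/p_x) = \mathbb{E}_{x \sim \mathcal{D}}[\ln(1/p_x)]$: draw a pivot sample $x$ from the stream, estimate $p_x$ from the frequency of $x$ among later samples, and average $\ln(1/\hat{p}_x)$ over trials. This is not the scheme of this paper or of Acharya et al.; the naive estimator below is biased, and the cited algorithms add careful bias correction to reach the stated sample complexities. All names (`stream_entropy_estimate`, `sampler`) are illustrative.

```python
import math
import random

def stream_entropy_estimate(stream, num_trials, window):
    """Crude constant-memory entropy estimate from an i.i.d. sample stream.

    Each trial draws one pivot sample x, estimates p_x by the frequency of
    x among the next `window` samples, and averages ln(1 / p_hat) across
    trials. Besides the running sum, only O(1) words are held at any time.
    """
    total = 0.0
    for _ in range(num_trials):
        x = next(stream)  # pivot sample drawn from D
        hits = sum(next(stream) == x for _ in range(window))
        p_hat = max(hits, 1) / window  # clamp so the log stays finite
        total += math.log(1.0 / p_hat)
    return total / num_trials

def sampler(probs):
    """Endless i.i.d. samples from a distribution over {0, ..., k-1}."""
    support = range(len(probs))
    while True:
        yield random.choices(support, weights=probs)[0]

probs = [0.5, 0.25, 0.125, 0.125]
true_entropy = -sum(p * math.log(p) for p in probs)
estimate = stream_entropy_estimate(sampler(probs), num_trials=1000, window=200)
print(f"true H = {true_entropy:.3f} nats, estimate = {estimate:.3f} nats")
```

Note why the memory stays constant: each trial touches only the current pivot symbol and a counter, never a $k$-sized histogram, which is what makes $O(1)$ words suffice even as the alphabet grows.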
