Multinomial Concentration in Relative Entropy at the Ratio of Alphabet and Sample Sizes

4 April 2019
R. Agrawal
arXiv:1904.02291
Abstract

We show that the moment generating function of the Kullback-Leibler divergence between the empirical distribution of $n$ independent samples from a distribution $P$ over a finite alphabet of size $k$ (e.g. a multinomial distribution) and $P$ itself is no more than that of a gamma distribution with shape $k - 1$ and rate $n$. The resulting exponential concentration inequality becomes meaningful (less than 1) when the divergence $\varepsilon$ is larger than $(k-1)/n$, whereas the standard method of types bound requires $\varepsilon > \frac{1}{n} \log \binom{n+k-1}{k-1} \geq \frac{k-1}{n} \log\!\left(1 + n/(k-1)\right)$, thus saving a factor of order $\log(n/k)$ in the standard regime of parameters where $n \gg k$. Our proof proceeds via a simple reduction to the case $k = 2$ of a binary alphabet (e.g. a binomial distribution), and has the property that improvements in the case of $k = 2$ directly translate to improvements for general $k$.
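As a rough illustration of the kind of bound the abstract describes, the sketch below simulates the KL divergence $D(\hat{P}_n \| P)$ for multinomial samples and compares its empirical tail probability with the Chernoff bound one would obtain from gamma-MGF domination with shape $k-1$ and rate $n$, namely $\Pr[D(\hat{P}_n \| P) \geq \varepsilon] \leq e^{-n\varepsilon}\,(e n \varepsilon/(k-1))^{k-1}$ for $\varepsilon \geq (k-1)/n$. This is a minimal sketch, not code from the paper: the choice of a uniform $P$, the sample sizes, and the specific closed-form Chernoff bound are assumptions made here for illustration.

```python
# Minimal simulation sketch (illustrative, not from the paper).
# Compares the empirical tail of the KL divergence D(P_hat || P) for
# n multinomial samples over a k-letter alphabet against the Chernoff
# bound implied by gamma(shape k-1, rate n) MGF domination:
#   Pr[D >= eps] <= exp(-n*eps) * (e*n*eps / (k-1))**(k-1),  eps >= (k-1)/n.
# P, n, k, and the trial count below are arbitrary illustrative choices.

import numpy as np


def kl_divergence(p_hat, p):
    """KL divergence D(p_hat || p) in nats, using the 0*log(0) = 0 convention."""
    mask = p_hat > 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / p[mask])))


def gamma_chernoff_bound(eps, n, k):
    """Chernoff tail bound for Gamma(shape=k-1, rate=n); valid for eps >= (k-1)/n."""
    a = k - 1
    return float(np.exp(-n * eps) * (np.e * n * eps / a) ** a)


rng = np.random.default_rng(0)
k, n, trials = 5, 200, 20_000
p = np.full(k, 1.0 / k)  # uniform P over k symbols (illustrative choice)

# Simulate D(P_hat || P) for `trials` independent samples of size n.
divs = np.empty(trials)
for t in range(trials):
    counts = rng.multinomial(n, p)
    divs[t] = kl_divergence(counts / n, p)

# The bound only becomes meaningful (below 1) once eps exceeds (k-1)/n.
for c in (1.5, 2.0, 3.0):
    eps = c * (k - 1) / n
    empirical = float(np.mean(divs >= eps))
    print(f"eps={eps:.4f}  empirical tail={empirical:.4f}  "
          f"gamma Chernoff bound={gamma_chernoff_bound(eps, n, k):.4f}")
```

In such a simulation the printed bound should dominate the empirical tail frequency at every threshold, consistent with the claim that the gamma-MGF bound gives nontrivial concentration as soon as $\varepsilon$ exceeds $(k-1)/n$.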
