Lower Bounds for Learning Distributions under Communication Constraints via Fisher Information

7 February 2019
L. P. Barnes
Yanjun Han
Ayfer Özgür
arXiv:1902.02890
Abstract

We consider the problem of learning high-dimensional, nonparametric and structured (e.g. Gaussian) distributions in distributed networks, where each node in the network observes an independent sample from the underlying distribution and can use k bits to communicate its sample to a central processor. We consider three different models for communication. Under the independent model, each node communicates its sample to a central processor by independently encoding it into k bits. Under the more general sequential or blackboard communication models, nodes can share information interactively but each node is restricted to write at most k bits on the final transcript. We characterize the impact of the communication constraint k on the minimax risk of estimating the underlying distribution under ℓ² loss. We develop minimax lower bounds that apply in a unified way to many common statistical models and reveal that the impact of the communication constraint can be qualitatively different depending on the tail behavior of the score function associated with each model. A key ingredient in our proofs is a geometric characterization of Fisher information from quantized samples.
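To make the abstract's setup concrete, here is a minimal sketch of the minimax quantity and the Fisher-information route to a lower bound. The notation (n nodes, quantizers q_i, prior π) is ours, and the displayed chain is the standard multivariate van Trees argument rather than a verbatim statement of the paper's theorems.

Each of n nodes observes X_i ~ p_θ i.i.d. with θ ∈ Θ ⊂ R^d; under the independent model, node i transmits a k-bit message Y_i = q_i(X_i) ∈ {0,1}^k. The minimax risk is

\[
R(n,k) \;=\; \inf_{\hat\theta,\,(q_i)} \;\sup_{\theta\in\Theta}\; \mathbb{E}_\theta\big\|\hat\theta(Y_1,\dots,Y_n)-\theta\big\|_2^2 .
\]

For a smooth prior \(\pi\) on \(\Theta\), a multivariate van Trees inequality gives

\[
R(n,k) \;\gtrsim\; \frac{d^2}{\sum_{i=1}^{n}\mathbb{E}_\pi\!\big[\operatorname{tr} I_{Y_i}(\theta)\big] + I(\pi)},
\qquad
\big[I_Y(\theta)\big]_{jl} \;=\; \mathbb{E}\big[\partial_{\theta_j}\log \mathbb{P}(Y\mid\theta)\,\partial_{\theta_l}\log \mathbb{P}(Y\mid\theta)\big],
\]

where \(I(\pi)\) is the Fisher information of the prior. The problem thus reduces to upper-bounding \(\operatorname{tr} I_Y(\theta)\), the Fisher information carried by a single k-bit message. This is where the tail behavior of the score function enters: how much of the raw-sample information \(\operatorname{tr} I_X(\theta)\) a k-bit quantizer can retain depends on how heavy the tails of \(\nabla_\theta \log p_\theta(X)\) are, which is what drives the qualitatively different rates mentioned in the abstract.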
