Communication Complexity of Estimating Correlations

25 January 2019
U. Hadar, Jingbo Liu, Yury Polyanskiy, O. Shayevitz
arXiv: 1901.09100
Abstract

We characterize the communication complexity of the following distributed estimation problem. Alice and Bob observe infinitely many iid copies of $\rho$-correlated unit-variance (Gaussian or $\pm 1$ binary) random variables, with unknown $\rho \in [-1,1]$. By interactively exchanging $k$ bits, Bob wants to produce an estimate $\hat\rho$ of $\rho$. We show that the best possible performance (optimized over interaction protocol $\Pi$ and estimator $\hat\rho$) satisfies $\inf_{\Pi, \hat\rho} \sup_\rho \mathbb{E}[|\rho - \hat\rho|^2] = \tfrac{1}{k}\left(\tfrac{1}{2\ln 2} + o(1)\right)$. Curiously, the number of samples in our achievability scheme is exponential in $k$; by contrast, a naive scheme exchanging $k$ samples achieves the same $\Omega(1/k)$ rate but with a suboptimal prefactor. Our protocol achieving optimal performance is one-way (non-interactive). We also prove the $\Omega(1/k)$ bound even when $\rho$ is restricted to any small open sub-interval of $[-1,1]$ (i.e., a local minimax lower bound). Our proof techniques rely on symmetric strong data-processing inequalities and various tensorization techniques from information-theoretic interactive common-randomness extraction. Our results also imply an $\Omega(n)$ lower bound on the information complexity of the Gap-Hamming problem, for which we show a direct information-theoretic proof.
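One natural reading of the naive $k$-sample baseline mentioned above, for the $\pm 1$ binary case, is: Alice transmits $k$ of her raw samples (one bit each) and Bob outputs the empirical correlation with his own copies. The Python sketch below simulates that reading (the function name and simulation setup are illustrative, not taken from the paper); its worst-case mean-squared error is $(1-\rho^2)/k \le 1/k$, i.e., the $\Omega(1/k)$ rate but with a prefactor larger than the optimal $\tfrac{1}{2\ln 2} \approx 0.72$.

```python
import numpy as np

def naive_one_way_estimate(rho, k, rng):
    """Illustrative sketch (not the paper's protocol): Alice sends her first k
    +/-1 samples verbatim; Bob returns the empirical correlation with his own."""
    # Generate k iid pairs of rho-correlated +/-1 bits:
    # X_i uniform on {-1,+1}; Y_i = X_i with prob (1+rho)/2, else Y_i = -X_i.
    x = rng.choice([-1.0, 1.0], size=k)
    flip = rng.random(k) < (1.0 - rho) / 2.0
    y = np.where(flip, -x, x)
    # Alice's k-bit message is x itself; Bob's estimator is the sample correlation.
    return np.mean(x * y)

rng = np.random.default_rng(0)
k, rho = 1000, 0.3
errs = [(naive_one_way_estimate(rho, k, rng) - rho) ** 2 for _ in range(2000)]
print(f"empirical MSE ~ {np.mean(errs):.2e}, predicted (1-rho^2)/k = {(1 - rho**2) / k:.2e}")
```

Since $X_iY_i$ is $\pm 1$ with mean $\rho$, the estimator is unbiased with variance $(1-\rho^2)/k$, which matches the $O(1/k)$ behavior stated in the abstract while leaving room for the constant-factor improvement the paper's (exponential-sample, one-way) scheme attains.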
