We characterize the communication complexity of the following distributed estimation problem. Alice and Bob observe infinitely many iid copies of $\rho$-correlated unit-variance (Gaussian or $\pm 1$ binary) random variables, with unknown $\rho \in [-1,1]$. By interactively exchanging $k$ bits, Bob wants to produce an estimate $\hat\rho$ of $\rho$. We show that the best possible performance (optimized over the interaction protocol $\Pi$ and the estimator $\hat\rho$) satisfies $\inf_{\Pi,\hat\rho} \sup_\rho \mathbb{E}\,|\hat\rho - \rho|^2 = \Theta(1/k)$. Curiously, the number of samples in our achievability scheme is exponential in $k$; by contrast, a naive scheme exchanging $k$ samples achieves the same $\Theta(1/k)$ rate but with a suboptimal prefactor. Our protocol achieving optimal performance is one-way (non-interactive). We also prove the $\Omega(1/k)$ lower bound even when $\rho$ is restricted to any small open sub-interval of $[-1,1]$ (i.e. a local minimax lower bound). Our proof techniques rely on symmetric strong data-processing inequalities and various tensorization techniques from information-theoretic interactive common-randomness extraction. Our results also imply an $\Omega(n)$ lower bound on the information complexity of the Gap-Hamming problem, for which we show a direct information-theoretic proof.
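The naive baseline mentioned above can be illustrated with a short simulation: Alice transmits $k$ of her $\pm 1$ samples verbatim (one bit each), and Bob estimates $\rho$ by the empirical correlation with his own samples, whose mean-squared error decays like $(1-\rho^2)/k$. This is a minimal sketch under the binary model; the function names and parameters are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def naive_protocol_mse(rho, k, trials=2000):
    """Simulate the naive one-way scheme: Alice sends k of her +/-1
    samples as k bits; Bob estimates rho by empirical correlation."""
    errs = np.empty(trials)
    for t in range(trials):
        # Draw k iid rho-correlated +/-1 pairs:
        # X uniform on {-1,+1}; Y = X with prob (1+rho)/2, else Y = -X.
        x = rng.choice([-1.0, 1.0], size=k)
        flip = rng.random(k) < (1.0 - rho) / 2.0
        y = np.where(flip, -x, x)
        rho_hat = np.mean(x * y)          # Bob's estimator from the k bits
        errs[t] = (rho_hat - rho) ** 2
    return errs.mean()

# The empirical MSE should scale roughly like (1 - rho^2)/k:
for k in (64, 256, 1024):
    print(k, naive_protocol_mse(0.3, k))
```

Quadrupling the budget $k$ roughly quarters the squared error, matching the $\Theta(1/k)$ rate; the point of the paper is that the optimal scheme attains a strictly smaller prefactor than this baseline.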