A Communication-Efficient Algorithm for Exponentially Fast Non-Bayesian Learning in Networks

4 September 2019
A. Mitra, J. Richards, S. Sundaram

Papers citing "A Communication-Efficient Algorithm for Exponentially Fast Non-Bayesian Learning in Networks"

4 / 4 papers shown

A New Approach for Distributed Hypothesis Testing with Extensions to Byzantine-Resilience
A. Mitra, J. Richards, S. Sundaram
14 Mar 2019

LAG: Lazily Aggregated Gradient for Communication-Efficient Distributed Learning
Tianyi Chen, G. Giannakis, Tao Sun, W. Yin
25 May 2018

Communication Optimality Trade-offs For Distributed Estimation
Anit Kumar Sahu, D. Jakovetić, S. Kar
12 Jan 2018

Distributed Mean Estimation with Limited Communication
A. Suresh, Felix X. Yu, Sanjiv Kumar, H. B. McMahan
02 Nov 2016