Communication-Efficient Distributed SVD via Local Power Iterations

19 February 2020
Xiang Li, Shusen Wang, Kun Chen, Zhihua Zhang
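
The listing below carries no technical detail, so for orientation, here is a minimal NumPy sketch of the local-power-iteration idea the title refers to: each worker runs several power iterations on its own row block of A, and the orthonormal iterates are synchronized only once per round instead of once per iteration. The function name, the column-sign alignment heuristic, and all parameter choices are illustrative assumptions, not the algorithm as specified in the paper.

```python
import numpy as np

def local_power_svd(blocks, k, rounds=10, local_iters=4, seed=0):
    """Estimate the top-k right singular subspace of A = [A_1; ...; A_m]
    (row blocks held by m workers), communicating once per round rather
    than once per local power iteration."""
    d = blocks[0].shape[1]
    rng = np.random.default_rng(seed)
    Z, _ = np.linalg.qr(rng.standard_normal((d, k)))  # shared random start
    for _ in range(rounds):
        local_iterates = []
        for A_i in blocks:                      # in parallel on each worker
            Z_i = Z.copy()
            for _ in range(local_iters):        # local power iterations
                Z_i, _ = np.linalg.qr(A_i.T @ (A_i @ Z_i))
            # crude column-sign alignment against the shared iterate so
            # that averaging does not cancel columns
            s = np.sign(np.sum(Z_i * Z, axis=0, keepdims=True))
            s[s == 0] = 1.0
            local_iterates.append(Z_i * s)
        # one communication round: average and re-orthonormalize
        Z, _ = np.linalg.qr(sum(local_iterates) / len(blocks))
    return Z
```

With blocks = np.array_split(A, m) for a tall matrix A, the returned Z should approximate the top-k right singular vectors of A up to rotation, with accuracy depending on the spectral gap and the number of rounds.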

Papers citing "Communication-Efficient Distributed SVD via Local Power Iterations"

6 / 6 papers shown

Optimal Client Sampling for Federated Learning
Jiajun He, Samuel Horváth, Peter Richtárik
FedML · 26 Oct 2020

Communication-efficient distributed eigenspace estimation
Vasileios Charisopoulos, Austin R. Benson, Anil Damle
05 Sep 2020

Cooperative SGD: A unified Framework for the Design and Analysis of Communication-Efficient SGD Algorithms
Jianyu Wang, Gauri Joshi
22 Aug 2018

Local SGD Converges Fast and Communicates Little
Sebastian U. Stich
FedML · 24 May 2018

Scalable Kernel K-Means Clustering with Nystrom Approximation: Relative-Error Bounds
Shusen Wang, Alex Gittens, Michael W. Mahoney
09 Jun 2017

Revisiting the Nystrom Method for Improved Large-Scale Machine Learning
Alex Gittens, Michael W. Mahoney
07 Mar 2013