Sharper Rates and Flexible Framework for Nonconvex SGD with Client and Data Sampling
Alexander Tyurin, Lukang Sun, Konstantin Burlachenko, Peter Richtárik
5 June 2022 · arXiv:2206.02275

Papers citing "Sharper Rates and Flexible Framework for Nonconvex SGD with Client and Data Sampling"

5 / 5 papers shown

Correlated Quantization for Faster Nonconvex Distributed Optimization
Andrei Panferov, Yury Demidovich, Ahmad Rammal, Peter Richtárik
MQ · 10 Jan 2024

Can 5th Generation Local Training Methods Support Client Sampling? Yes!
Michał Grudzień, Grigory Malinovsky, Peter Richtárik
29 Dec 2022

FL_PyTorch: optimization research simulator for federated learning
Konstantin Burlachenko, Samuel Horváth, Peter Richtárik
FedML · 07 Feb 2022

Permutation Compressors for Provably Faster Distributed Nonconvex Optimization
Rafal Szlendak, Alexander Tyurin, Peter Richtárik
07 Oct 2021

Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
Samuel Horváth, Lihua Lei, Peter Richtárik, Michael I. Jordan
13 Feb 2020