arXiv: 2206.02275
Sharper Rates and Flexible Framework for Nonconvex SGD with Client and Data Sampling
5 June 2022
Alexander Tyurin, Lukang Sun, Konstantin Burlachenko, Peter Richtárik
Papers citing "Sharper Rates and Flexible Framework for Nonconvex SGD with Client and Data Sampling" (5 of 5 shown):
Correlated Quantization for Faster Nonconvex Distributed Optimization
Andrei Panferov, Yury Demidovich, Ahmad Rammal, Peter Richtárik
10 Jan 2024
Can 5th Generation Local Training Methods Support Client Sampling? Yes!
Michał Grudzień, Grigory Malinovsky, Peter Richtárik
29 Dec 2022
FL_PyTorch: Optimization Research Simulator for Federated Learning
Konstantin Burlachenko, Samuel Horváth, Peter Richtárik
07 Feb 2022
Permutation Compressors for Provably Faster Distributed Nonconvex Optimization
Rafal Szlendak, Alexander Tyurin, Peter Richtárik
07 Oct 2021
Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
Samuel Horváth, Lihua Lei, Peter Richtárik, Michael I. Jordan
13 Feb 2020