ResearchTrend.AI
Sharper Rates and Flexible Framework for Nonconvex SGD with Client and Data Sampling

5 June 2022
Alexander Tyurin
Lukang Sun
Konstantin Burlachenko
Peter Richtárik

Papers citing "Sharper Rates and Flexible Framework for Nonconvex SGD with Client and Data Sampling"

7 / 7 papers shown
BurTorch: Revisiting Training from First Principles by Coupling Autodiff, Math Optimization, and Systems
Konstantin Burlachenko
Peter Richtárik
AI4CE
18 Mar 2025

Tighter Performance Theory of FedExProx
Wojciech Anyszka
Kaja Gruntkowska
Alexander Tyurin
Peter Richtárik
FedML
20 Oct 2024

SPAM: Stochastic Proximal Point Method with Momentum Variance Reduction for Non-convex Cross-Device Federated Learning
Avetik G. Karagulyan
Egor Shulgin
Abdurakhmon Sadiev
Peter Richtárik
FedML
30 May 2024

Freya PAGE: First Optimal Time Complexity for Large-Scale Nonconvex Finite-Sum Optimization with Heterogeneous Asynchronous Computations
Alexander Tyurin
Kaja Gruntkowska
Peter Richtárik
24 May 2024

Correlated Quantization for Faster Nonconvex Distributed Optimization
Andrei Panferov
Yury Demidovich
Ahmad Rammal
Peter Richtárik
MQ
10 Jan 2024

Improving Accelerated Federated Learning with Compression and Importance Sampling
Michal Grudzień
Grigory Malinovsky
Peter Richtárik
FedML
05 Jun 2023

Can 5th Generation Local Training Methods Support Client Sampling? Yes!
Michal Grudzień
Grigory Malinovsky
Peter Richtárik
29 Dec 2022