
Theoretically Better and Numerically Faster Distributed Optimization with Smoothness-Aware Quantization Techniques

7 June 2021
Bokun Wang
M. Safaryan
Peter Richtárik
    MQ
ArXiv · PDF · HTML

Papers citing "Theoretically Better and Numerically Faster Distributed Optimization with Smoothness-Aware Quantization Techniques"

2 / 2 papers shown
GradSkip: Communication-Accelerated Local Gradient Methods with Better Computational Complexity
Artavazd Maranjyan, M. Safaryan, Peter Richtárik
28 Oct 2022
NUQSGD: Provably Communication-efficient Data-parallel SGD via Nonuniform Quantization
Ali Ramezani-Kebrya, Fartash Faghri, Ilya Markov, V. Aksenov, Dan Alistarh, Daniel M. Roy
MQ
28 Apr 2021