Taming Momentum in a Distributed Asynchronous Environment
Ido Hakimi, Saar Barkai, Moshe Gabel, Assaf Schuster
26 July 2019 · arXiv:1907.11612

Papers citing "Taming Momentum in a Distributed Asynchronous Environment" (6 of 6 papers shown)
Nesterov Method for Asynchronous Pipeline Parallel Optimization
Thalaiyasingam Ajanthan, Sameera Ramasinghe, Yan Zuo, Gil Avraham, Alexander Long
02 May 2025

Reducing Impacts of System Heterogeneity in Federated Learning using Weight Update Magnitudes
Irene Wang
30 Aug 2022

FedAdapt: Adaptive Offloading for IoT Devices in Federated Learning
Di Wu, R. Ullah, P. Harvey, Peter Kilpatrick, I. Spence, Blesson Varghese
09 Jul 2021

Pipelined Backpropagation at Scale: Training Large Models without Batches
Atli Kosson, Vitaliy Chiley, Abhinav Venigalla, Joel Hestness, Urs Koster
25 Mar 2020

Distributed Hierarchical GPU Parameter Server for Massive Scale Deep Learning Ads Systems
Weijie Zhao, Deping Xie, Ronglai Jia, Yulei Qian, Rui Ding, Mingming Sun, P. Li
12 Mar 2020 · MoE

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
15 Sep 2016 · ODL