APMSqueeze: A Communication Efficient Adam-Preconditioned Momentum SGD Algorithm
arXiv:2008.11343 · 26 August 2020
Hanlin Tang, Shaoduo Gan, Samyam Rajbhandari, Xiangru Lian, Ji Liu, Yuxiong He, Ce Zhang
Papers citing "APMSqueeze: A Communication Efficient Adam-Preconditioned Momentum SGD Algorithm" (5 papers shown)
PowerSGD: Practical Low-Rank Gradient Compression for Distributed Optimization
Thijs Vogels, Sai Praneeth Karimireddy, Martin Jaggi
31 May 2019

DoubleSqueeze: Parallel Stochastic Gradient Descent with Double-Pass Error-Compensated Compression
Hanlin Tang, Xiangru Lian, Chen Yu, Tong Zhang, Ji Liu
15 May 2019

Adaptive Gradient Methods with Dynamic Bound of Learning Rate
Liangchen Luo, Yuanhao Xiong, Yan Liu, Xu Sun
26 Feb 2019

Asynchronous Stochastic Gradient Descent with Delay Compensation
Shuxin Zheng, Qi Meng, Taifeng Wang, Wei Chen, Nenghai Yu, Zhiming Ma, Tie-Yan Liu
27 Sep 2016

ADADELTA: An Adaptive Learning Rate Method
Matthew D. Zeiler
22 Dec 2012