On the Outsized Importance of Learning Rates in Local Update Methods

Zachary B. Charles, Jakub Konečný
2 July 2020 · FedML
arXiv: 2007.00878 (abs / PDF / HTML)

Papers citing "On the Outsized Importance of Learning Rates in Local Update Methods"

Showing 5 of 55 citing papers.
Optimization Methods for Large-Scale Machine Learning
Léon Bottou, Frank E. Curtis, J. Nocedal
15 Jun 2016 · 3,226 citations

Communication-Efficient Learning of Deep Networks from Decentralized Data
H. B. McMahan, Eider Moore, Daniel Ramage, S. Hampson, Blaise Agüera y Arcas
FedML · 17 Feb 2016 · 17,615 citations

Adam: A Method for Stochastic Optimization
Diederik P. Kingma, Jimmy Ba
ODL · 22 Dec 2014 · 150,364 citations

One weird trick for parallelizing convolutional neural networks
A. Krizhevsky
GNN · 23 Apr 2014 · 1,303 citations

Making Gradient Descent Optimal for Strongly Convex Stochastic Optimization
Alexander Rakhlin, Ohad Shamir, Karthik Sridharan
26 Sep 2011 · 769 citations