Better Mini-Batch Algorithms via Accelerated Gradient Methods

22 June 2011 · arXiv:1106.4574
Andrew Cotter, Ohad Shamir, Nathan Srebro, Karthik Sridharan
ODL

Papers citing "Better Mini-Batch Algorithms via Accelerated Gradient Methods"

5 / 5 papers shown
Ordered Momentum for Asynchronous SGD
Chang-Wei Shi, Yi-Rui Yang, Wu-Jun Li
ODL · 27 Jul 2024

SARAH: A Novel Method for Machine Learning Problems Using Stochastic Recursive Gradient
Lam M. Nguyen, Jie Liu, K. Scheinberg, Martin Takáč
ODL · 01 Mar 2017

Distributed Delayed Stochastic Optimization
Alekh Agarwal, John C. Duchi
28 Apr 2011

Optimal Distributed Online Prediction using Mini-Batches
O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao
07 Dec 2010

Optimistic Rates for Learning with a Smooth Loss
Nathan Srebro, Karthik Sridharan, Ambuj Tewari
20 Sep 2010