arXiv:1603.00570 (v3, latest)
Without-Replacement Sampling for Stochastic Gradient Methods: Convergence Results and Application to Distributed Optimization
2 March 2016
Ohad Shamir
Papers citing
"Without-Replacement Sampling for Stochastic Gradient Methods: Convergence Results and Application to Distributed Optimization"
10 papers shown
Introduction to Online Convex Optimization
Elad Hazan (07 Sep 2019)

An Introduction to Matrix Concentration Inequalities
J. Tropp (07 Jan 2015)

Communication-Efficient Distributed Dual Coordinate Ascent
Martin Jaggi, Virginia Smith, Martin Takáč, Jonathan Terhorst, S. Krishnan, Thomas Hofmann, Michael I. Jordan (04 Sep 2014)

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien (01 Jul 2014)

A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method
Simon Lacoste-Julien, Mark Schmidt, Francis R. Bach (10 Dec 2012)

Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
Ohad Shamir, Tong Zhang (08 Dec 2012)

Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
Shai Shalev-Shwartz, Tong Zhang (10 Sep 2012)

Beneath the valley of the noncommutative arithmetic-geometric mean inequality: conjectures, case-studies, and consequences
Benjamin Recht, Christopher Ré (19 Feb 2012)

A Reliable Effective Terascale Linear Learning System
Alekh Agarwal, O. Chapelle, Miroslav Dudík, John Langford (19 Oct 2011)

Better Mini-Batch Algorithms via Accelerated Gradient Methods
Andrew Cotter, Ohad Shamir, Nathan Srebro, Karthik Sridharan (22 Jun 2011)