RSG: Beating Subgradient Method without Smoothness and Strong Convexity
Tianbao Yang, Qihang Lin
9 December 2015
Papers citing "RSG: Beating Subgradient Method without Smoothness and Strong Convexity" (11 of 11 papers shown):
Some Primal-Dual Theory for Subgradient Methods for Strongly Convex Optimization
Benjamin Grimmer, Danlin Li (31 Dec 2024)
Federated Learning on Adaptively Weighted Nodes by Bilevel Optimization
Yan Huang, Qihang Lin, N. Street, Stephen Seung-Yeob Baek (21 Jul 2022)
Minimax risk classifiers with 0-1 loss
Santiago Mazuelas, Mauricio Romero, Peter Grünwald (17 Jan 2022)
An Online Method for A Class of Distributionally Robust Optimization with Non-Convex Objectives
Qi Qi, Zhishuai Guo, Yi Tian Xu, R. L. Jin, Tianbao Yang (17 Jun 2020)
Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization
Baojian Zhou, F. Chen, Yiming Ying (09 May 2019)
Adaptive Accelerated Gradient Converging Methods under Hölderian Error Bound Condition
Mingrui Liu, Tianbao Yang (23 Nov 2016)
Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark W. Schmidt (16 Aug 2016)
A Richer Theory of Convex Constrained Optimization with Reduced Projections and Improved Rates
Tianbao Yang, Qihang Lin, Lijun Zhang (11 Aug 2016)
Homotopy Smoothing for Non-Smooth Problems with Lower Complexity than O(1/ε)
Yi Tian Xu, Yan Yan, Qihang Lin, Tianbao Yang (13 Jul 2016)
Accelerate Stochastic Subgradient Method by Leveraging Local Growth Condition
Yi Tian Xu, Qihang Lin, Tianbao Yang (04 Jul 2016)
A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method
Simon Lacoste-Julien, Mark W. Schmidt, Francis R. Bach (10 Dec 2012)