arXiv:1808.03880
Parallelization does not Accelerate Convex Optimization: Adaptivity Lower Bounds for Non-smooth Convex Minimization
Eric Balkanski, Yaron Singer
12 August 2018
Papers citing "Parallelization does not Accelerate Convex Optimization: Adaptivity Lower Bounds for Non-smooth Convex Minimization" (10 of 10 papers shown)
Memory-Query Tradeoffs for Randomized Convex Optimization
Xinyu Chen, Binghui Peng
21 Jun 2023

Lower Bounds and Accelerated Algorithms in Distributed Stochastic Optimization with Communication Compression
Yutong He, Xinmeng Huang, Yiming Chen, W. Yin, Kun Yuan
12 May 2023

ReSQueing Parallel and Private Stochastic Convex Optimization
Y. Carmon, A. Jambulapati, Yujia Jin, Y. Lee, Daogao Liu, Aaron Sidford, Kevin Tian
01 Jan 2023

Efficient Convex Optimization Requires Superlinear Memory
A. Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant
29 Mar 2022

Lower Bounds and Optimal Algorithms for Personalized Federated Learning
Filip Hanzely, Slavomír Hanzely, Samuel Horváth, Peter Richtárik
05 Oct 2020

Optimal Complexity in Decentralized Training
Yucheng Lu, Christopher De Sa
15 Jun 2020

Complexity of Highly Parallel Non-Smooth Convex Optimization
Sébastien Bubeck, Qijia Jiang, Y. Lee, Yuanzhi Li, Aaron Sidford
25 Jun 2019

Lower Bounds for Parallel and Randomized Convex Optimization
Jelena Diakonikolas, Cristóbal Guzmán
05 Nov 2018

Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
Ohad Shamir, Tong Zhang
08 Dec 2012

Optimal Distributed Online Prediction using Mini-Batches
O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao
07 Dec 2010