Asynchronous Parallel Stochastic Quasi-Newton Methods
arXiv:2011.00667 · 2 November 2020
Qianqian Tong, Guannan Liang, Xingyu Cai, Chunjiang Zhu, J. Bi

Papers citing "Asynchronous Parallel Stochastic Quasi-Newton Methods" (22 papers)
Non-asymptotic Superlinear Convergence of Standard Quasi-Newton Methods
Qiujiang Jin, Aryan Mokhtari (30 Mar 2020)

Fast and Furious Convergence: Stochastic Second Order Methods under Interpolation
S. Meng, Sharan Vaswani, I. Laradji, Mark Schmidt, Simon Lacoste-Julien (11 Oct 2019)

Globally Convergent Newton Methods for Ill-conditioned Generalized Self-concordant Losses
Ulysse Marteau-Ferey, Francis R. Bach, Alessandro Rudi (03 Jul 2019)

Global linear convergence of Newton's method without strong-convexity or Lipschitz gradients
Sai Praneeth Karimireddy, Sebastian U. Stich, Martin Jaggi (01 Jun 2018)

A Progressive Batching L-BFGS Method for Machine Learning
Raghu Bollapragada, Dheevatsa Mudigere, J. Nocedal, Hao-Jun Michael Shi, P. T. P. Tang (15 Feb 2018)

Stochastic L-BFGS: Improved Convergence Rates and Practical Acceleration Strategies
Renbo Zhao, W. Haskell, Vincent Y. F. Tan (01 Apr 2017)

IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate
Aryan Mokhtari, Mark Eisen, Alejandro Ribeiro (02 Feb 2017)

Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization
Xiao Wang, Shiqian Ma, Donald Goldfarb, Wei Liu (05 Jul 2016)

Optimization Methods for Large-Scale Machine Learning
Léon Bottou, Frank E. Curtis, J. Nocedal (15 Jun 2016)

A Multi-Batch L-BFGS Method for Machine Learning
A. Berahas, J. Nocedal, Martin Takáč (19 May 2016)

A Decentralized Quasi-Newton Method for Dual Formulations of Consensus Optimization
Mark Eisen, Aryan Mokhtari, Alejandro Ribeiro (23 Mar 2016)

Stop Wasting My Gradients: Practical SVRG
Reza Babanezhad, Mohamed Osama Ahmed, Alim Virani, Mark Schmidt, Jakub Konečný, Scott Sallinen (05 Nov 2015)

A Linearly-Convergent Stochastic L-BFGS Algorithm
Philipp Moritz, Robert Nishihara, Michael I. Jordan (09 Aug 2015)

Asynchronous Parallel Stochastic Gradient for Nonconvex Optimization
Xiangru Lian, Yijun Huang, Y. Li, Ji Liu (27 Jun 2015)

On Variance Reduction in Stochastic Gradient Descent and its Asynchronous Variants
Sashank J. Reddi, Ahmed S. Hefny, S. Sra, Barnabás Póczós, Alex Smola (23 Jun 2015)

PASSCoDe: Parallel ASynchronous Stochastic dual Co-ordinate Descent
Cho-Jui Hsieh, Hsiang-Fu Yu, Inderjit S. Dhillon (06 Apr 2015)

Adam: A Method for Stochastic Optimization
Diederik P. Kingma, Jimmy Ba (22 Dec 2014)

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien (01 Jul 2014)

A Proximal Stochastic Gradient Method with Progressive Variance Reduction
Lin Xiao, Tong Zhang (19 Mar 2014)

A Stochastic Quasi-Newton Method for Large-Scale Optimization
R. Byrd, Samantha Hansen, J. Nocedal, Y. Singer (27 Jan 2014)

Minimizing Finite Sums with the Stochastic Average Gradient
Mark Schmidt, Nicolas Le Roux, Francis R. Bach (10 Sep 2013)

HOGWILD!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent
Feng Niu, Benjamin Recht, Christopher Ré, Stephen J. Wright (28 Jun 2011)