ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate
2 February 2017
Aryan Mokhtari
Mark Eisen
Alejandro Ribeiro

Papers citing "IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate"

8 / 8 papers shown

  • Incremental Gauss-Newton Methods with Superlinear Convergence Rates
    Zhiling Zhou, Zhuanghua Liu, Chengchang Liu, Luo Luo
    03 Jul 2024
  • Second-order Information Promotes Mini-Batch Robustness in Variance-Reduced Gradients
    Sachin Garg, A. Berahas, Michal Dereziński
    23 Apr 2024
  • SPIRAL: A Superlinearly Convergent Incremental Proximal Algorithm for Nonconvex Finite Sum Minimization
    Pourya Behmandpoor, P. Latafat, Andreas Themelis, Marc Moonen, Panagiotis Patrinos
    17 Jul 2022
  • SP2: A Second Order Stochastic Polyak Method
    Shuang Li, W. Swartworth, Martin Takáč, Deanna Needell, Robert Mansel Gower
    17 Jul 2022
  • Stochastic Variance-Reduced Newton: Accelerating Finite-Sum Minimization with Large Batches
    Michal Dereziński
    06 Jun 2022
  • Non-asymptotic Superlinear Convergence of Standard Quasi-Newton Methods
    Qiujiang Jin, Aryan Mokhtari
    30 Mar 2020
  • Flexible Numerical Optimization with ensmallen
    Ryan R. Curtin, Marcus Edel, Rahul Prabhu, S. Basak, Zhihao Lou, Conrad Sanderson
    09 Mar 2020
  • Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate
    Aryan Mokhtari, Mert Gurbuzbalaban, Alejandro Ribeiro
    01 Nov 2016