ResearchTrend.AI
Fast and Furious Convergence: Stochastic Second Order Methods under Interpolation
arXiv:1910.04920 · 11 October 2019
S. Meng, Sharan Vaswani, I. Laradji, Mark W. Schmidt, Simon Lacoste-Julien


Papers citing "Fast and Furious Convergence: Stochastic Second Order Methods under Interpolation"

9 papers shown

Efficient Curvature-Aware Hypergradient Approximation for Bilevel Optimization
Youran Dong, Junfeng Yang, Wei-Ting Yao, Jin Zhang
04 May 2025

SAPPHIRE: Preconditioned Stochastic Variance Reduction for Faster Large-Scale Statistical Learning
Jingruo Sun, Zachary Frangella, Madeleine Udell
28 Jan 2025

Faster Convergence of Stochastic Accelerated Gradient Descent under Interpolation
Aaron Mishkin, Mert Pilanci, Mark Schmidt
03 Apr 2024

Critical Influence of Overparameterization on Sharpness-aware Minimization
Sungbin Shin, Dongyeop Lee, Maksym Andriushchenko, Namhoon Lee
29 Nov 2023

On the Convergence to a Global Solution of Shuffling-Type Gradient Algorithms
Lam M. Nguyen, Trang H. Tran
13 Jun 2022

Hessian Averaging in Stochastic Newton Methods Achieves Superlinear Convergence
Sen Na, Michal Derezinski, Michael W. Mahoney
20 Apr 2022

SCORE: Approximating Curvature Information under Self-Concordant Regularization
Adeyemi Damilare Adeoye, Alberto Bemporad
14 Dec 2021

SVRG Meets AdaGrad: Painless Variance Reduction
Benjamin Dubois-Taine, Sharan Vaswani, Reza Babanezhad, Mark W. Schmidt, Simon Lacoste-Julien
18 Feb 2021

Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark W. Schmidt
16 Aug 2016