Super-Universal Regularized Newton Method
arXiv:2208.05888, 11 August 2022
N. Doikov, Konstantin Mishchenko, Y. Nesterov
Papers citing "Super-Universal Regularized Newton Method" (10 papers shown)
Second-Order Min-Max Optimization with Lazy Hessians
Lesi Chen, Chengchang Liu, Jingzhao Zhang (12 Oct 2024)
Incremental Gauss-Newton Methods with Superlinear Convergence Rates
Zhiling Zhou, Zhuanghua Liu, Chengchang Liu, Luo Luo (03 Jul 2024)
Stochastic Newton Proximal Extragradient Method
Ruichen Jiang, Michał Dereziński, Aryan Mokhtari (03 Jun 2024)
Adaptive proximal gradient methods are universal without approximation
Konstantinos A. Oikonomidis, Emanuel Laude, P. Latafat, Andreas Themelis, Panagiotis Patrinos (09 Feb 2024)
SANIA: Polyak-type Optimization Framework Leads to Scale Invariant Stochastic Algorithms
Farshed Abdukhakimov, Chulu Xiang, Dmitry Kamzolov, Robert Mansel Gower, Martin Takáč (28 Dec 2023)
First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians
N. Doikov, G. N. Grapiglia (05 Sep 2023)
Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method
N. Doikov (28 Aug 2023)
Unified Convergence Theory of Stochastic and Variance-Reduced Cubic Newton Methods
El Mahdi Chayti, N. Doikov, Martin Jaggi (23 Feb 2023)
Second-order optimization with lazy Hessians
N. Doikov, El Mahdi Chayti, Martin Jaggi (01 Dec 2022)
The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization
D. Kovalev, Alexander Gasnikov (19 May 2022)