Cited By

Accelerated zero-order SGD under high-order smoothness and overparameterized regime (arXiv:2411.13999)
21 November 2024
Georgii Bychkov, D. Dvinskikh, Anastasia Antsiferova, Alexander Gasnikov, Aleksandr Lobanov
Papers citing "Accelerated zero-order SGD under high-order smoothness and overparameterized regime" (9 papers shown)

1. KerZOO: Kernel Function Informed Zeroth-Order Optimization for Accurate and Accelerated LLM Fine-Tuning (24 May 2025)
   Zhendong Mi, Qitao Tan, Xiaodong Yu, Zining Zhu, Geng Yuan, Shaoyi Huang

2. A gradient estimator via L1-randomization for online zero-order optimization with two point feedback (27 May 2022)
   A. Akhavan, Evgenii Chzhen, Massimiliano Pontil, Alexandre B. Tsybakov

3. Training Compute-Optimal Large Language Models (29 Mar 2022)
   Jordan Hoffmann, Sebastian Borgeaud, A. Mensch, Elena Buchatskaya, Trevor Cai, ..., Karen Simonyan, Erich Elsen, Jack W. Rae, Oriol Vinyals, Laurent Sifre

4. Gradient-Free Methods for Saddle-Point Problem (12 May 2020)
   Aleksandr Beznosikov, Abdurakhmon Sadiev, Alexander Gasnikov

5. Reconciling modern machine learning practice and the bias-variance trade-off (28 Dec 2018)
   M. Belkin, Daniel J. Hsu, Siyuan Ma, Soumik Mandal

6. Learning and Generalization in Overparameterized Neural Networks, Going Beyond Two Layers (12 Nov 2018)
   Zeyuan Allen-Zhu, Yuanzhi Li, Yingyu Liang

7. Highly-Smooth Zero-th Order Online Optimization (26 May 2016)
   Francis R. Bach, Vianney Perchet

8. An Optimal Algorithm for Bandit and Zero-Order Convex Optimization with Two-Point Feedback (31 Jul 2015)
   Ohad Shamir

9. Optimal rates for zero-order convex optimization: the power of two function evaluations (07 Dec 2013)
   John C. Duchi, Michael I. Jordan, Martin J. Wainwright, Andre Wibisono