arXiv: 2402.01621
Stochastic Two Points Method for Deep Model Zeroth-order Optimization
Yijiang Pang, Jiayu Zhou
2 February 2024
Papers citing "Stochastic Two Points Method for Deep Model Zeroth-order Optimization" (5 papers)
Just One Byte (per gradient): A Note on Low-Bandwidth Decentralized Language Model Finetuning Using Shared Randomness
E. Zelikman, Qian Huang, Percy Liang, Nick Haber, Noah D. Goodman
16 Jun 2023
Convergence of Adam Under Relaxed Assumptions
Haochuan Li, Alexander Rakhlin, Ali Jadbabaie
27 Apr 2023
Noise Is Not the Main Factor Behind the Gap Between SGD and Adam on Transformers, but Sign Descent Might Be
Frederik Kunstner, Jacques Chen, J. Lavington, Mark W. Schmidt
27 Apr 2023
The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al-Rfou, Noah Constant
18 Apr 2021
Densely Connected Convolutional Networks
Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
25 Aug 2016