Stepsize anything: A unified learning rate schedule for budgeted-iteration training

The expanding computational costs and limited resources underscore the critical need for budgeted-iteration training, which aims to achieve optimal learning within predetermined iteration budgets. While learning rate schedules fundamentally govern the performance of different networks and tasks, particularly in budgeted-iteration scenarios, their design remains largely heuristic and lacks theoretical foundations. In addition, finding the optimal learning rate schedule requires extensive trial-and-error selection, making the training process inefficient. In this work, we propose the Unified Budget-Aware (UBA) schedule, a theoretically grounded learning rate schedule that consistently outperforms commonly-used schedules across diverse architectures and tasks under different constrained training budgets. First, we bridge the gap by constructing a novel training-budget-aware optimization framework that explicitly accounts for robustness to landscape curvature variations. From this framework, we derive the UBA schedule, controlled by a single hyper-parameter that provides a trade-off between flexibility and simplicity, eliminating the need for per-network numerical optimization. Moreover, we establish a theoretical connection between this hyper-parameter and the condition number, adding interpretation and justification to our approach. Besides, we prove convergence for different values of the hyper-parameter and offer practical guidelines for its selection via theoretical analysis and empirical evidence. Extensive experimental results show that UBA consistently surpasses commonly-used schedules across diverse vision and language tasks, spanning network architectures (e.g., ResNet, OLMo) and scales, under different training-iteration budgets.
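The abstract does not state the closed form of the UBA schedule, so the sketch below is only a hedged illustration of the interface it describes: a learning rate schedule parameterized by the total iteration budget T and a single shape hyper-parameter (called phi here). The function budget_aware_lr and its linear/cosine interpolation are placeholder assumptions, not the paper's formula; only the budget-aware structure is taken from the abstract.

```python
# Minimal sketch of plugging a budget-aware schedule into a PyTorch
# training loop via LambdaLR. NOTE: the shape below is an illustrative
# placeholder, NOT the UBA formula (the abstract does not give it); it
# only shows a schedule driven by the iteration budget T and a single
# hyper-parameter phi.

import math
import torch

def budget_aware_lr(step: int, total_steps: int, phi: float) -> float:
    """Return a multiplicative LR factor in [0, 1] for the given step.

    Placeholder shape: interpolates between linear decay (phi -> 0)
    and cosine decay (phi -> 1) over the fixed iteration budget.
    """
    progress = min(step / max(total_steps, 1), 1.0)
    linear = 1.0 - progress
    cosine = 0.5 * (1.0 + math.cos(math.pi * progress))
    return (1.0 - phi) * linear + phi * cosine

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
T, phi = 1000, 0.5  # iteration budget and the single shape hyper-parameter
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lr_lambda=lambda step: budget_aware_lr(step, T, phi)
)

for step in range(T):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()
    loss.backward()
    optimizer.step()
    scheduler.step()  # the LR factor depends on step / T, so the whole
                      # curve reshapes when the budget T changes
```

Because the factor is a function of step / T rather than of the step alone, the same hyper-parameter yields a complete decay curve for any budget, which is the property the abstract highlights for budgeted-iteration training.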
@article{tang2025_2505.24452,
  title={Stepsize anything: A unified learning rate schedule for budgeted-iteration training},
  author={Anda Tang and Yiming Dong and Yutao Zeng and Xun Zhou and Zhouchen Lin},
  journal={arXiv preprint arXiv:2505.24452},
  year={2025}
}