
Leveraging Coordinate Momentum in SignSGD and Muon: Memory-Optimized Zero-Order

4 June 2025
Egor Petrov
Grigoriy Evseev
Aleksey Antonov
Andrey Veprikov
Pavel Plyusnin
Nikolay Bushkov
Stanislav Moiseev
Aleksandr Beznosikov
Abstract

Fine-tuning Large Language Models (LLMs) is essential for adapting pre-trained models to downstream tasks. Yet traditional first-order optimizers such as Stochastic Gradient Descent (SGD) and Adam incur prohibitive memory and computational costs that scale poorly with model size. In this paper, we investigate zero-order (ZO) optimization methods as a memory- and compute-efficient alternative, particularly in the context of parameter-efficient fine-tuning techniques like LoRA. We propose \texttt{JAGUAR SignSGD}, a ZO momentum-based algorithm that extends ZO SignSGD, requiring the same number of parameters as the standard ZO SGD and only $\mathcal{O}(1)$ function evaluations per iteration. To the best of our knowledge, this is the first study to establish rigorous convergence guarantees for SignSGD in the stochastic ZO case. We further propose \texttt{JAGUAR Muon}, a novel ZO extension of the Muon optimizer that leverages the matrix structure of model parameters, and we provide its convergence rate under arbitrary stochastic noise. Through extensive experiments on challenging LLM fine-tuning benchmarks, we demonstrate that the proposed algorithms meet or exceed the convergence quality of standard first-order methods, achieving significant memory reduction. Our theoretical and empirical results establish new ZO optimization methods as a practical and theoretically grounded approach for resource-constrained LLM adaptation. Our code is available at this https URL
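The abstract describes the optimizers only at a high level. As a rough, non-authoritative illustration of the general idea, the sketch below implements a generic two-point zero-order SignSGD step with momentum in PyTorch; the function name zo_signsgd_step, the two-point gradient estimator, and the hyperparameters are assumptions made for illustration, and the snippet does not reproduce the paper's JAGUAR coordinate-momentum rule or the JAGUAR Muon update.

import torch

def zo_signsgd_step(params, loss_fn, momentum_buf, lr=1e-4, mu=1e-3, beta=0.9):
    # One zero-order SignSGD-style step with momentum (illustrative sketch,
    # not the paper's JAGUAR rule). The gradient is estimated from two loss
    # evaluations along a random direction, so no backward pass is needed.
    zs = [torch.randn_like(p) for p in params]  # random perturbation direction
    with torch.no_grad():
        for p, z in zip(params, zs):            # perturb: x + mu * z
            p.add_(mu * z)
        loss_plus = float(loss_fn())
        for p, z in zip(params, zs):            # perturb: x - mu * z
            p.sub_(2 * mu * z)
        loss_minus = float(loss_fn())
        for p, z in zip(params, zs):            # restore original parameters
            p.add_(mu * z)
        g_scale = (loss_plus - loss_minus) / (2 * mu)  # directional derivative estimate
        for p, z, m in zip(params, zs, momentum_buf):
            m.mul_(beta).add_((1 - beta) * g_scale * z)  # momentum on the ZO estimate
            p.sub_(lr * torch.sign(m))                   # sign update, as in SignSGD

# Usage sketch: momentum_buf = [torch.zeros_like(p) for p in params], and
# loss_fn() evaluates the fine-tuning loss on a minibatch at the current
# parameters (no gradients required), returning a scalar.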

@article{petrov2025_2506.04430,
  title={Leveraging Coordinate Momentum in SignSGD and Muon: Memory-Optimized Zero-Order},
  author={Egor Petrov and Grigoriy Evseev and Aleksey Antonov and Andrey Veprikov and Pavel Plyusnin and Nikolay Bushkov and Stanislav Moiseev and Aleksandr Beznosikov},
  journal={arXiv preprint arXiv:2506.04430},
  year={2025}
}