HTMuon: Improving Muon via Heavy-Tailed Spectral Correction

Tianyu Pang
Yujie Fang
Zihang Liu
Shenyang Deng
Lei Hsiung
Shuhua Yu
Yaoqing Yang
Main: 14 pages · 12 figures · Bibliography: 5 pages · 22 tables · Appendix: 18 pages
Abstract

Muon has recently shown promising results in LLM training. In this work, we study how to further improve Muon. We argue that Muon's orthogonalized update rule suppresses the emergence of heavy-tailed weight spectra and over-emphasizes training along noise-dominated directions. Motivated by the Heavy-Tailed Self-Regularization (HT-SR) theory, we propose HTMuon. HTMuon preserves Muon's ability to capture parameter interdependencies while producing heavier-tailed updates and inducing heavier-tailed weight spectra. Experiments on LLM pretraining and image classification show that HTMuon consistently improves performance over state-of-the-art baselines and can also serve as a plug-in on top of existing Muon variants. For example, on LLaMA pretraining on the C4 dataset, HTMuon reduces perplexity by up to 0.98 compared to Muon. We further show theoretically that HTMuon corresponds to steepest descent under a Schatten-q norm constraint and provide a convergence analysis in smooth non-convex settings. The implementation of HTMuon is available at this https URL.
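To make the contrast concrete, here is a minimal sketch of the two update rules described above. Muon orthogonalizes the gradient (in practice via Newton-Schulz iteration; plain SVD is used here for clarity), which flattens all singular values to 1. The HTMuon variant below is an illustrative assumption based on the abstract: instead of flattening the spectrum, it keeps a power of the singular values so the update spectrum is heavier-tailed. The exponent `alpha` and the normalization are hypothetical choices for illustration, not the paper's exact rule.

```python
import numpy as np

def muon_update(grad):
    # Muon's orthogonalized update: replace every singular value with 1,
    # so all directions (including noise-dominated ones) get equal weight.
    U, s, Vt = np.linalg.svd(grad, full_matrices=False)
    return U @ Vt

def htmuon_update(grad, alpha=0.5):
    # Hypothetical heavy-tailed spectral correction (sketch, not the
    # paper's exact rule): raise singular values to a power alpha in
    # (0, 1) rather than flattening them, preserving a heavier-tailed
    # spectrum in the update, then normalize the overall update scale.
    U, s, Vt = np.linalg.svd(grad, full_matrices=False)
    s_corr = s ** alpha
    s_corr = s_corr / (np.linalg.norm(s_corr) + 1e-12)
    return (U * s_corr) @ Vt
```

With `alpha = 0`, `htmuon_update` recovers Muon's fully flattened spectrum (up to scale); with `alpha = 1`, it reduces to a normalized gradient step, so `alpha` interpolates between the two regimes.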
