PoLAR: Polar-Decomposed Low-Rank Adapter Representation

Main: 9 pages · 10 figures · 9 tables · Bibliography: 7 pages · Appendix: 24 pages
Abstract

We show that low-rank adaptation of large-scale models suffers from a low stable rank that is well below the linear algebraic rank of the subspace, degrading fine-tuning performance. To mitigate the underutilization of the allocated subspace, we propose PoLAR, a parameterization inspired by the polar decomposition that factorizes the low-rank update into two direction matrices constrained to Stiefel manifolds and an unconstrained scale matrix. Our theory shows that PoLAR yields an exponentially faster convergence rate on a canonical low-rank adaptation problem. Pairing the parameterization with Riemannian optimization leads to consistent gains on three different benchmarks testing general language understanding, commonsense reasoning, and mathematical problem solving with base model sizes ranging from 350M to 27B.
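To make the parameterization concrete, here is a minimal NumPy sketch of a PoLAR-style update. The function name `polar_lowrank_update` and the QR-based orthonormalization are illustrative assumptions, not the paper's implementation: the update is factorized as ΔW = X S Yᵀ, with direction matrices X and Y having orthonormal columns (points on Stiefel manifolds) and an unconstrained r×r scale matrix S.

```python
import numpy as np

def polar_lowrank_update(d_out, d_in, r, rng):
    """Sketch of a PoLAR-style factorization: Delta W = X @ S @ Y.T.

    X (d_out x r) and Y (d_in x r) have orthonormal columns, i.e. they
    lie on Stiefel manifolds; S (r x r) is an unconstrained scale matrix.
    QR is used here only as one convenient way to produce orthonormal
    columns -- the paper pairs the parameterization with Riemannian
    optimization to keep X and Y on their manifolds during training.
    """
    X, _ = np.linalg.qr(rng.standard_normal((d_out, r)))
    Y, _ = np.linalg.qr(rng.standard_normal((d_in, r)))
    S = rng.standard_normal((r, r))  # unconstrained scale factor
    return X, S, Y, X @ S @ Y.T

rng = np.random.default_rng(0)
X, S, Y, dW = polar_lowrank_update(64, 32, 8, rng)
# The direction matrices satisfy the Stiefel constraint X^T X = I_r,
# and the resulting update dW has rank at most r = 8.
assert np.allclose(X.T @ X, np.eye(8), atol=1e-8)
assert np.allclose(Y.T @ Y, np.eye(8), atol=1e-8)
```

Unlike the standard LoRA factorization ΔW = BA, scale and direction are decoupled here, which is what the paper argues mitigates the low stable rank of the learned update.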

@article{lion2025_2506.03133,
  title={PoLAR: Polar-Decomposed Low-Rank Adapter Representation},
  author={Kai Lion and Liang Zhang and Bingcong Li and Niao He},
  journal={arXiv preprint arXiv:2506.03133},
  year={2025}
}