Spectral-factorized Positive-definite Curvature Learning for NN Training

Abstract

Many training methods, such as Adam(W) and Shampoo, learn a positive-definite curvature matrix and apply an inverse root to it before using it as a preconditioner. Non-diagonal methods such as Shampoo have recently gained significant attention; however, they remain computationally inefficient and are restricted to specific types of curvature information because the matrix root must be computed via a costly matrix decomposition. To address this, we propose a Riemannian optimization approach that dynamically adapts spectral-factorized positive-definite curvature estimates, enabling the efficient application of arbitrary matrix roots and generic curvature learning. We demonstrate the efficacy and versatility of our approach in positive-definite matrix optimization and covariance adaptation for gradient-free optimization, as well as its efficiency in curvature learning for neural net training.
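The key convenience of keeping a curvature estimate in spectral-factorized form C = B diag(d) Bᵀ is that any matrix power C^p reduces to powering the eigenvalues: C^p = B diag(d^p) Bᵀ. The following NumPy sketch illustrates that identity only; it is not the paper's adaptive Riemannian update, and the function name is our own.

```python
import numpy as np

def matrix_power_spectral(C, p):
    """Apply an arbitrary real power p to a symmetric positive-definite
    matrix C via its spectral factorization C = B diag(d) B^T."""
    d, B = np.linalg.eigh(C)      # eigenvalues d > 0, orthogonal eigenvectors B
    return (B * d**p) @ B.T       # B diag(d**p) B^T

# Example: precondition a gradient with the inverse square root C^{-1/2}
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
C = A @ A.T + 4 * np.eye(4)       # a well-conditioned SPD matrix
g = rng.standard_normal(4)
preconditioned_g = matrix_power_spectral(C, -0.5) @ g

# Sanity check: C^{1/2} C^{-1/2} should recover the identity
err = np.linalg.norm(
    matrix_power_spectral(C, 0.5) @ matrix_power_spectral(C, -0.5) - np.eye(4)
)
```

Once the eigendecomposition is maintained incrementally (as the paper proposes) rather than recomputed, switching between roots such as C^{-1/2}, C^{-1/4}, or C^{-1} costs only an elementwise power of the eigenvalues.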

@article{lin2025_2502.06268,
  title={Spectral-factorized Positive-definite Curvature Learning for NN Training},
  author={Wu Lin and Felix Dangel and Runa Eschenhagen and Juhan Bae and Richard E. Turner and Roger B. Grosse},
  journal={arXiv preprint arXiv:2502.06268},
  year={2025}
}