ResearchTrend.AI
Spectral alignment of stochastic gradient descent for high-dimensional classification tasks

4 October 2023
Gerard Ben Arous
Reza Gheissari
Jiaoyang Huang
Aukosh Jagannath
Abstract

We rigorously study the relation between the training dynamics of stochastic gradient descent (SGD) and the spectra of the empirical Hessian and gradient matrices. We prove that in two canonical classification tasks for multi-class high-dimensional mixtures and either one- or two-layer neural networks, both the SGD trajectory and the emergent outlier eigenspaces of the Hessian and gradient matrices align with a common low-dimensional subspace. Moreover, in multi-layer settings this alignment occurs per layer, with the final layer's outlier eigenspace evolving over the course of training and exhibiting rank deficiency when the SGD converges to sub-optimal classifiers. This rigorously establishes several of the predictions that have emerged over the last decade from extensive numerical studies of the spectra of Hessian and information matrices over the course of training in overparametrized networks.
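The alignment phenomenon can be seen in a minimal numerical sketch. The setup below is illustrative, not the paper's exact model: a two-class Gaussian mixture (the k=2 case of the multi-class mixtures above) trained by online SGD on the squared loss, with all dimensions and step sizes chosen only for the demonstration. It checks that the SGD iterate, the top eigenvector of the empirical Hessian, and the top eigenvector of the gradient second-moment matrix all align with the one-dimensional signal subspace span(mu).

```python
import numpy as np

# Toy sketch (assumed setup, not the paper's exact setting): samples
# X = y*mu + Gaussian noise in d dimensions, labels y in {-1, +1}.
rng = np.random.default_rng(0)
d, n = 200, 4000
mu = np.zeros(d)
mu[0] = 1.5                                # signal direction, illustrative scale
y = rng.choice([-1.0, 1.0], size=n)
X = y[:, None] * mu + rng.standard_normal((n, d))

# Online SGD on the squared loss (x.w - y)^2 / 2, one sample per step.
w = np.zeros(d)
lr = 1e-3
for t in range(30000):
    i = rng.integers(n)
    w -= lr * (X[i] @ w - y[i]) * X[i]

# Empirical Hessian of the squared loss (= X^T X / n) and the
# second-moment matrix of per-sample gradients r_i * x_i at the trained w.
H = X.T @ X / n
r = X @ w - y                              # per-sample residuals
G = (X * (r**2)[:, None]).T @ X / n

def top_alignment(M, v):
    """|<top eigenvector of M, v/|v|>| for a symmetric matrix M."""
    _, vecs = np.linalg.eigh(M)
    return abs(vecs[:, -1] @ v) / np.linalg.norm(v)

print("SGD iterate vs mu:     ",
      abs(w @ mu) / (np.linalg.norm(w) * np.linalg.norm(mu)))
print("Hessian outlier vs mu: ", top_alignment(H, mu))
print("gradient outlier vs mu:", top_alignment(G, mu))
```

All three printed overlaps come out close to 1, while a random unit vector in 200 dimensions would have overlap of order 1/sqrt(200) ≈ 0.07 with mu; the squared loss is used purely for simplicity, since its Hessian is the sample covariance and its outlier structure is easy to inspect.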

@article{arous2025_2310.03010,
  title={Spectral alignment of stochastic gradient descent for high-dimensional classification tasks},
  author={Gerard Ben Arous and Reza Gheissari and Jiaoyang Huang and Aukosh Jagannath},
  journal={arXiv preprint arXiv:2310.03010},
  year={2025}
}