Learning a Single Index Model from Anisotropic Data with vanilla Stochastic Gradient Descent

31 March 2025
Guillaume Braun
Minh Ha Quang
Masaaki Imaizumi
    MLT
Abstract

We investigate the problem of learning a Single Index Model (SIM), a popular model for studying the ability of neural networks to learn features, from anisotropic Gaussian inputs by training a single neuron with vanilla Stochastic Gradient Descent (SGD). While the isotropic case has been studied extensively, the anisotropic case has received less attention, and the impact of the covariance matrix on the learning dynamics remains unclear. For instance, Mousavi-Hosseini et al. (2023b) proposed a spherical SGD that requires a separate estimate of the data covariance matrix, thereby oversimplifying the influence of the covariance. In this study, we analyze the learning dynamics of vanilla SGD under the SIM with anisotropic input data and demonstrate that vanilla SGD automatically adapts to the data's covariance structure. Leveraging these results, we derive upper and lower bounds on the sample complexity in terms of an effective dimension that is determined by the structure of the covariance matrix rather than by the ambient input dimension.
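The setting described above can be illustrated with a minimal sketch: draw anisotropic Gaussian inputs, label them with a single index model, and run vanilla online SGD on a single neuron. This is only an illustration under assumed choices (the dimension, covariance spectrum, tanh link function, step size, and sample count are all placeholders, not the paper's configuration).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (all constants here are arbitrary choices for this
# sketch, not the paper's exact configuration).
d = 50
eigvals = 1.0 / (1.0 + np.arange(d))          # decaying spectrum => anisotropy
Sigma_sqrt = np.diag(np.sqrt(eigvals))        # Sigma = diag(eigvals)

w_star = rng.standard_normal(d)
w_star /= np.linalg.norm(w_star)              # ground-truth direction, ||w*|| = 1

# Single index model: y = phi(<w_star, x>), with phi = tanh as an example link.
w = rng.standard_normal(d) / np.sqrt(d)       # random initialization
eta = 0.05                                    # constant step size
for _ in range(20_000):                       # one fresh sample per SGD step
    x = Sigma_sqrt @ rng.standard_normal(d)   # x ~ N(0, Sigma), anisotropic
    y = np.tanh(w_star @ x)
    pred = np.tanh(w @ x)
    grad = (pred - y) * (1.0 - pred**2) * x   # gradient of 0.5 * (pred - y)^2
    w -= eta * grad                           # vanilla SGD: no whitening step

# Cosine alignment between the learned neuron and the true direction
alignment = (w @ w_star) / np.linalg.norm(w)
print(f"alignment = {alignment:.3f}")
```

Note that the update uses the raw inputs directly, with no separate covariance estimation or spherical projection; the paper's claim is that this vanilla dynamics adapts to the covariance structure on its own, with sample complexity governed by an effective dimension of Sigma rather than by d.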

@article{braun2025_2503.23642,
  title={Learning a Single Index Model from Anisotropic Data with vanilla Stochastic Gradient Descent},
  author={Guillaume Braun and Minh Ha Quang and Masaaki Imaizumi},
  journal={arXiv preprint arXiv:2503.23642},
  year={2025}
}