The high-dimensional asymptotics of first order methods with random data

14 December 2021
Michael Celentano, Chen Cheng, Andrea Montanari
Abstract

We study a class of deterministic flows in $\mathbb{R}^{d\times k}$, parametrized by a random matrix $\boldsymbol{X}\in\mathbb{R}^{n\times d}$ with i.i.d. centered subgaussian entries. We characterize the asymptotic behavior of these flows over bounded time horizons, in the high-dimensional limit in which $n,d\to\infty$ with $k$ fixed and converging aspect ratios $n/d\to\delta$. The asymptotic characterization we prove is in terms of a nonlinear stochastic process in $k$ dimensions, whose parameters are determined by a fixed point condition. This type of characterization is known in physics as dynamical mean field theory. Rigorous results of this type have been obtained in the past for a few spin glass models. Our proof is based on time discretization and a reduction to certain iterative schemes known as approximate message passing (AMP) algorithms, rather than on the large deviations theory and stochastic processes theory used in earlier work. The new approach allows for a more elementary proof and implies that the high-dimensional behavior of the flow is universal with respect to the distribution of the entries of $\boldsymbol{X}$. As specific applications, we obtain high-dimensional characterizations of gradient flow in some classical models from statistics and machine learning, under a random design assumption.
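To make the setting concrete, here is a minimal sketch (not the paper's construction) of the kind of first-order dynamics the abstract describes: an Euler discretization of gradient flow for least squares under a random design, run over a bounded time horizon. All names and values (n, d, the noise level, the step size dt, the horizon T) are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

n, d = 2000, 1000                              # aspect ratio delta = n/d = 2
X = rng.standard_normal((n, d)) / np.sqrt(d)   # i.i.d. centered (sub)Gaussian entries
theta_star = rng.standard_normal(d)            # ground-truth signal (illustrative)
y = X @ theta_star + 0.1 * rng.standard_normal(n)

def grad(theta):
    # Gradient of the least-squares loss (1/2n) * ||y - X theta||^2
    return -X.T @ (y - X @ theta) / n

# Euler scheme theta_{t+dt} = theta_t - dt * grad(theta_t), over a
# bounded horizon T, matching the bounded-time regime in the abstract.
dt, T = 0.05, 5.0
theta = np.zeros(d)
for _ in range(int(T / dt)):
    theta -= dt * grad(theta)

print("final loss:", 0.5 * np.mean((y - X @ theta) ** 2))

In the high-dimensional limit described above (n, d large at fixed ratio), the trajectory of low-dimensional observables of theta is the object that dynamical mean field theory characterizes; the universality result implies the same limit holds if the Gaussian entries of X are replaced by other centered subgaussian entries.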

arXiv:2112.07572