  3. 2307.06092
Quantitative CLTs in Deep Neural Networks

12 July 2023
Stefano Favaro
Boris Hanin
Domenico Marinucci
Ivan Nourdin
Giovanni Peccati
Abstract

We study the distribution of a fully connected neural network with random Gaussian weights and biases in which the hidden layer widths are proportional to a large constant $n$. Under mild assumptions on the non-linearity, we obtain quantitative bounds on normal approximations valid at large but finite $n$ and any fixed network depth. Our theorems show, both for the finite-dimensional distributions and for the entire process, that the distance between a random fully connected network (and its derivatives) and the corresponding infinite-width Gaussian process scales like $n^{-\gamma}$ for $\gamma > 0$, with the exponent depending on the metric used to measure discrepancy. Our bounds are strictly stronger, in terms of their dependence on network width, than any previously available in the literature; in the one-dimensional case, we also prove that they are optimal, i.e., we establish matching lower bounds.
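To make the setup concrete, here is a minimal simulation sketch, not code from the paper: it draws many independent fully connected networks at a finite width $n$, with i.i.d. Gaussian weights and biases under the standard $1/\sqrt{\text{fan-in}}$ scaling, and measures the Kolmogorov-Smirnov distance between one output coordinate and the infinite-width Gaussian limit. The tanh non-linearity, the depth, the variance conventions, and all names are illustrative assumptions; the limiting variance is computed by Monte Carlo via the usual Gaussian-process recursion for this scaling.

```python
# A minimal sketch (assumed setup, not the paper's code): sample random
# fully connected networks at finite width n and compare one output
# coordinate against the infinite-width Gaussian limit.
import numpy as np
from scipy import stats

def network_output(x, widths, rng):
    """One draw of a fully connected net with i.i.d. N(0,1) weights and
    biases, weights scaled by 1/sqrt(fan-in) so that a Gaussian-process
    limit exists as the hidden widths grow."""
    h = x
    L = len(widths) - 1
    for l in range(L):
        W = rng.standard_normal((widths[l + 1], widths[l])) / np.sqrt(widths[l])
        b = rng.standard_normal(widths[l + 1])
        h = W @ h + b
        if l < L - 1:              # no non-linearity after the output layer
            h = np.tanh(h)
    return h

def limiting_variance(x, n_layers, mc=200_000, seed=1):
    """Variance of the infinite-width Gaussian limit at input x, via the
    standard recursion K_{l+1} = E[tanh(Z)^2] + 1 with Z ~ N(0, K_l),
    evaluated by Monte Carlo (tanh and unit bias variance are assumptions)."""
    z = np.random.default_rng(seed).standard_normal(mc)
    K = np.mean(x ** 2) + 1.0      # first layer: |x|^2 / d_in + bias variance
    for _ in range(n_layers - 1):
        K = np.mean(np.tanh(np.sqrt(K) * z) ** 2) + 1.0
    return K

rng = np.random.default_rng(0)
d_in, n, depth = 3, 256, 4                    # illustrative sizes
widths = [d_in] + [n] * depth + [1]
x = rng.standard_normal(d_in)

samples = np.array([network_output(x, widths, rng)[0] for _ in range(5_000)])
sigma = np.sqrt(limiting_variance(x, n_layers=depth + 1))

# Kolmogorov-Smirnov distance to N(0, sigma^2); per the paper's bounds,
# this discrepancy should shrink like a power of 1/n as the width grows.
ks = stats.kstest(samples, "norm", args=(0.0, sigma)).statistic
print(f"width n={n}: KS distance to the Gaussian limit ~ {ks:.4f}")
```

Repeating the comparison at several widths (e.g. n = 64, 256, 1024) and fitting the slope of log KS distance against log n gives a rough empirical read on the kind of exponent $\gamma$ that the paper controls theoretically; the exact rate in the theorems depends on the metric used, which this sketch does not attempt to reproduce.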
