Large-width functional asymptotics for deep Gaussian neural networks

20 February 2021
Daniele Bracale
Stefano Favaro
S. Fortini
Stefano Peluchetti
Abstract

In this paper, we consider fully connected feed-forward deep neural networks where weights and biases are independent and identically distributed according to Gaussian distributions. Extending previous results (Matthews et al., 2018a;b; Yang, 2019), we adopt a function-space perspective, i.e. we look at neural networks as infinite-dimensional random elements on the input space $\mathbb{R}^I$. Under suitable assumptions on the activation function we show that: i) a network defines a continuous Gaussian process on the input space $\mathbb{R}^I$; ii) a network with re-scaled weights converges weakly to a continuous Gaussian process in the large-width limit; iii) the limiting Gaussian process has almost surely locally $\gamma$-Hölder continuous paths, for $0 < \gamma < 1$. Our results contribute to recent theoretical studies on the interplay between infinitely wide deep neural networks and Gaussian processes by establishing weak convergence in function space with respect to a stronger metric.
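As a minimal illustration of the setting the abstract describes (not the paper's method): the NumPy sketch below draws many independent fully connected networks with i.i.d. Gaussian weights and biases, with weights rescaled by $1/\sqrt{\text{fan-in}}$, and evaluates each at two fixed inputs in $\mathbb{R}^I$. As the hidden widths grow, the joint empirical distribution of the outputs should approach a bivariate Gaussian, consistent with the Gaussian-process limit. The tanh activation and the variance parameters `sigma_w` and `sigma_b` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def deep_net_output(x, widths, sigma_w=1.0, sigma_b=0.1, rng=None):
    """One random draw of a fully connected feed-forward network.

    Weights and biases are i.i.d. Gaussian; weight standard deviations
    are rescaled by 1/sqrt(fan_in) so the large-width limit is
    non-degenerate.  x is a (num_inputs, I) array of points in R^I.
    """
    rng = np.random.default_rng() if rng is None else rng
    h = x
    for n_out in widths:
        n_in = h.shape[1]
        W = rng.normal(0.0, sigma_w / np.sqrt(n_in), size=(n_in, n_out))
        b = rng.normal(0.0, sigma_b, size=n_out)
        h = np.tanh(h @ W + b)
    # Final linear read-out to a scalar output.
    n_in = h.shape[1]
    W = rng.normal(0.0, sigma_w / np.sqrt(n_in), size=(n_in, 1))
    b = rng.normal(0.0, sigma_b, size=1)
    return (h @ W + b).ravel()

# Evaluate many independent networks at two inputs; with wide hidden
# layers the draws should look like samples from a GP restricted to
# these two points, i.e. approximately bivariate Gaussian.
rng = np.random.default_rng(0)
x = np.array([[0.5], [-1.0]])  # two points in R^I with I = 1
draws = np.array([deep_net_output(x, widths=[512, 512, 512], rng=rng)
                  for _ in range(2000)])
print("empirical mean:", draws.mean(axis=0))
print("empirical covariance:\n", np.cov(draws.T))
```

Increasing the widths (e.g. from 512 to 2048) should make the empirical covariance more stable across seeds, while shrinking the widths makes non-Gaussian behavior of the finite network visible; the paper's results concern the stronger, function-space version of this convergence.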
