Neural Tangent Kernel Beyond the Infinite-Width Limit: Effects of Depth and Initialization

1 February 2022
Mariia Seleznova, Gitta Kutyniok

Papers citing "Neural Tangent Kernel Beyond the Infinite-Width Limit: Effects of Depth and Initialization"

6 / 6 papers shown
The Challenges of the Nonlinear Regime for Physics-Informed Neural Networks
Andrea Bonfanti, Giuseppe Bruno, Cristina Cipriani
06 Feb 2024

Mind the spikes: Benign overfitting of kernels and neural networks in fixed dimension
Moritz Haas, David Holzmüller, U. V. Luxburg, Ingo Steinwart
23 May 2023

Efficient Parametric Approximations of Neural Network Function Space Distance
Nikita Dhawan, Sicong Huang, Juhan Bae, Roger C. Grosse
07 Feb 2023

Approximation results for Gradient Descent trained Shallow Neural Networks in $1d$
R. Gentile, G. Welper
17 Sep 2022

Deep Networks and the Multiple Manifold Problem
Sam Buchanan, D. Gilboa, John N. Wright
25 Aug 2020

Why bigger is not always better: on finite and infinite neural networks
Laurence Aitchison
17 Oct 2019