Analyzing Finite Neural Networks: Can We Trust Neural Tangent Kernel Theory?

8 December 2020
Mariia Seleznova, Gitta Kutyniok
AAML

Papers citing "Analyzing Finite Neural Networks: Can We Trust Neural Tangent Kernel Theory?"

6 papers shown.

Issues with Neural Tangent Kernel Approach to Neural Networks
Haoran Liu, Anthony S. Tai, David J. Crandall, Chunfeng Huang
19 Jan 2025

The Challenges of the Nonlinear Regime for Physics-Informed Neural Networks
Andrea Bonfanti, Giuseppe Bruno, Cristina Cipriani
06 Feb 2024

Width and Depth Limits Commute in Residual Networks
Soufiane Hayou, Greg Yang
01 Feb 2023

Joint Embedding Self-Supervised Learning in the Kernel Regime
B. Kiani, Randall Balestriero, Yubei Chen, S. Lloyd, Yann LeCun
SSL
29 Sep 2022

Gaussian Pre-Activations in Neural Networks: Myth or Reality?
Pierre Wolinski, Julyan Arbel
AI4CE
24 May 2022

The Future is Log-Gaussian: ResNets and Their Infinite-Depth-and-Width Limit at Initialization
Mufan Bill Li, Mihai Nica, Daniel M. Roy
07 Jun 2021