
A Theory of Neural Tangent Kernel Alignment and Its Influence on Training
H. Shan, Blake Bordelon
29 May 2021 (arXiv:2105.14301)

Papers citing "A Theory of Neural Tangent Kernel Alignment and Its Influence on Training"

10 papers shown

1. Generalization through variance: how noise shapes inductive biases in diffusion models
   John J. Vastola (16 Apr 2025) [DiffM]
2. Connecting NTK and NNGP: A Unified Theoretical Framework for Wide Neural Network Learning Dynamics
   Yehonatan Avidan, Qianyi Li, H. Sompolinsky (08 Sep 2023)
3. A Framework and Benchmark for Deep Batch Active Learning for Regression
   David Holzmüller, Viktor Zaverkin, Johannes Kästner, Ingo Steinwart (17 Mar 2022) [UQCV, BDL, GP]
4. Neural Networks as Kernel Learners: The Silent Alignment Effect
   Alexander B. Atanasov, Blake Bordelon, Cengiz Pehlevan (29 Oct 2021) [MLT]
5. The Principles of Deep Learning Theory
   Daniel A. Roberts, Sho Yaida, Boris Hanin (18 Jun 2021) [FaML, PINN, GNN]
6. Deep learning versus kernel learning: an empirical study of loss landscape geometry and the time evolution of the Neural Tangent Kernel
   Stanislav Fort, Gintare Karolina Dziugaite, Mansheej Paul, Sepideh Kharaghani, Daniel M. Roy, Surya Ganguli (28 Oct 2020)
7. When Do Neural Networks Outperform Kernel Methods?
   Behrooz Ghorbani, Song Mei, Theodor Misiakiewicz, Andrea Montanari (24 Jun 2020)
8. On the asymptotics of wide networks with polynomial activations
   Kyle Aitken, Guy Gur-Ari (11 Jun 2020)
9. A Fine-Grained Spectral Perspective on Neural Networks
   Greg Yang, Hadi Salman (24 Jul 2019)
10. Exact solutions to the nonlinear dynamics of learning in deep linear neural networks
    Andrew M. Saxe, James L. McClelland, Surya Ganguli (20 Dec 2013) [ODL]