ResearchTrend.AI

Neural Tangent Kernel of Neural Networks with Loss Informed by Differential Operators

14 March 2025
Weiye Gan
Yicheng Li
Qian Lin
Zuoqiang Shi
Abstract

Spectral bias is a significant phenomenon in neural network training and can be explained by neural tangent kernel (NTK) theory. In this work, we develop the NTK theory for deep neural networks with physics-informed loss, providing insights into the convergence of the NTK during initialization and training, and revealing its explicit structure. We find that, in most cases, the differential operators in the loss function do not induce a faster eigenvalue decay rate or a stronger spectral bias. Experimental results are also presented to verify the theory.
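To make the central object concrete, here is a minimal NumPy sketch (not from the paper; the network, width, and inputs are illustrative assumptions) of the empirical NTK of a one-hidden-layer network, K(x, x') = ⟨∇_θ f(x), ∇_θ f(x')⟩, whose eigenvalue decay governs the spectral bias discussed in the abstract.

```python
import numpy as np

# Hypothetical example: empirical NTK of f(x) = (1/sqrt(m)) * sum_j a_j * tanh(w_j*x + b_j).
# The NTK Gram matrix on a set of inputs is K = J @ J.T, where J stacks the
# parameter gradients of f at each input.

rng = np.random.default_rng(0)
m = 2000                        # hidden width; NTK theory concerns the m -> infinity limit
w = rng.normal(size=m)          # input weights, standard Gaussian initialization
b = rng.normal(size=m)          # biases
a = rng.normal(size=m)          # output weights

def jacobian(x):
    """Gradient of f(x) with respect to all parameters (a, w, b)."""
    pre = w * x + b
    act = np.tanh(pre)
    dact = 1.0 - act**2         # derivative of tanh
    g_a = act / np.sqrt(m)
    g_w = a * dact * x / np.sqrt(m)
    g_b = a * dact / np.sqrt(m)
    return np.concatenate([g_a, g_w, g_b])

xs = np.linspace(-1.0, 1.0, 20)
J = np.stack([jacobian(x) for x in xs])   # shape: (n_points, n_params)
K = J @ J.T                               # empirical NTK Gram matrix

eigvals = np.linalg.eigvalsh(K)[::-1]     # eigenvalues, largest first
print(K.shape, eigvals[:3])
```

The sorted eigenvalues of `K` show the decay profile; the paper's claim concerns how applying a differential operator inside the loss changes (or does not change) the rate of this decay.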

View on arXiv
@article{gan2025_2503.11029,
  title={Neural Tangent Kernel of Neural Networks with Loss Informed by Differential Operators},
  author={Weiye Gan and Yicheng Li and Qian Lin and Zuoqiang Shi},
  journal={arXiv preprint arXiv:2503.11029},
  year={2025}
}