Informed Learning by Wide Neural Networks: Convergence, Generalization and Sampling Complexity
Jianyi Yang, Shaolei Ren
arXiv:2207.00751, 2 July 2022

Papers citing "Informed Learning by Wide Neural Networks: Convergence, Generalization and Sampling Complexity"

3 / 3 papers shown
Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher
Guangda Ji, Zhanxing Zhu
20 Oct 2020
Fine-Grained Visual Categorization using Meta-Learning Optimization with Sample Selection of Auxiliary Data
Yabin Zhang, Hui Tang, K. Jia
28 Jul 2018
Interaction Networks for Learning about Objects, Relations and Physics
Peter W. Battaglia, Razvan Pascanu, Matthew Lai, Danilo Jimenez Rezende, Koray Kavukcuoglu
Topics: AI4CE, OCL, PINN, GNN
01 Dec 2016