ResearchTrend.AI
Embedding Principle in Depth for the Loss Landscape Analysis of Deep Neural Networks

26 May 2022
Zhiwei Bai, Tao Luo, Z. Xu, Yaoyu Zhang

Papers citing "Embedding Principle in Depth for the Loss Landscape Analysis of Deep Neural Networks"

6 / 6 papers shown
  1. Local Linear Recovery Guarantee of Deep Neural Networks at Overparameterization
     Yaoyu Zhang, Leyang Zhang, Zhongwang Zhang, Zhiwei Bai
     26 Jun 2024
  2. Connectivity Shapes Implicit Regularization in Matrix Factorization Models for Matrix Completion
     Zhiwei Bai, Jiajie Zhao, Yaoyu Zhang
     22 May 2024
  3. Linear Stability Hypothesis and Rank Stratification for Nonlinear Models
     Yaoyu Zhang, Zhongwang Zhang, Leyang Zhang, Zhiwei Bai, Tao Luo, Z. Xu
     21 Nov 2022
  4. Implicit regularization of dropout
     Zhongwang Zhang, Zhi-Qin John Xu
     13 Jul 2022
  5. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
     N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
     15 Sep 2016
  6. Benefits of depth in neural networks
     Matus Telgarsky
     14 Feb 2016