ResearchTrend.AI

Convergence and Implicit Regularization Properties of Gradient Descent for Deep Residual Networks

14 April 2022
R. Cont, Alain Rossier, Renyuan Xu
MLT

Papers citing "Convergence and Implicit Regularization Properties of Gradient Descent for Deep Residual Networks"

4 / 4 papers shown
Generalization of Scaled Deep ResNets in the Mean-Field Regime
Yihang Chen, Fanghui Liu, Yiping Lu, Grigorios G. Chrysos, V. Cevher
14 Mar 2024
Implicit regularization of deep residual networks towards neural ODEs
P. Marion, Yu-Han Wu, Michael E. Sander, Gérard Biau
03 Sep 2023
Do Residual Neural Networks discretize Neural Ordinary Differential Equations?
Michael E. Sander, Pierre Ablin, Gabriel Peyré
29 May 2022
Representing smooth functions as compositions of near-identity functions with implications for deep network optimization
Peter L. Bartlett, S. Evans, Philip M. Long
13 Apr 2018