
Combining Explicit and Implicit Regularization for Efficient Learning in Deep Networks

1 June 2023
Dan Zhao

Papers citing "Combining Explicit and Implicit Regularization for Efficient Learning in Deep Networks"

5 / 5 papers shown

1. Pretraining with Random Noise for Fast and Robust Learning without Weight Transport
   Jeonghwan Cheon, Sang Wan Lee, Se-Bum Paik · OOD · 27 May 2024

2. Efficient Compression of Overparameterized Deep Models through Low-Dimensional Learning Dynamics
   Soo Min Kwon, Zekai Zhang, Dogyoon Song, Laura Balzano, Qing Qu · 08 Nov 2023

3. Stretched and measured neural predictions of complex network dynamics
   V. Vasiliauskaite, Nino Antulov-Fantulin · 12 Jan 2023

4. Understanding Gradient Descent on Edge of Stability in Deep Learning
   Sanjeev Arora, Zhiyuan Li, A. Panigrahi · MLT · 19 May 2022

5. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang · ODL · 15 Sep 2016