The Flip Side of the Reweighted Coin: Duality of Adaptive Dropout and Regularization

14 June 2021
Daniel LeJeune, Hamid Javadi, Richard G. Baraniuk

Papers citing "The Flip Side of the Reweighted Coin: Duality of Adaptive Dropout and Regularization"

5 papers shown
  • On the Convergence of Shallow Neural Network Training with Randomly Masked Neurons
    Fangshuo Liao, Anastasios Kyrillidis (05 Dec 2021)
  • Prune Your Model Before Distill It
    Jinhyuk Park, Albert No (30 Sep 2021)
  • Dropout: Explicit Forms and Capacity Control
    R. Arora, Peter L. Bartlett, Poorya Mianjy, Nathan Srebro (06 Mar 2020)
  • Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
    Y. Gal, Zoubin Ghahramani (06 Jun 2015)
  • SLOPE is Adaptive to Unknown Sparsity and Asymptotically Minimax
    Weijie Su, Emmanuel Candes (29 Mar 2015)