Dropout: Explicit Forms and Capacity Control
arXiv:2003.03397 · 6 March 2020
R. Arora, Peter L. Bartlett, Poorya Mianjy, Nathan Srebro
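For context on the operation studied in this paper and in the citing papers listed below, here is a minimal NumPy sketch of standard (inverted) dropout. It is a generic illustration only: the function name dropout_forward and the rate p are illustrative choices, and the sketch does not reproduce the explicit regularizer or capacity bounds derived in the paper.

```python
import numpy as np

def dropout_forward(x, p=0.5, rng=None, training=True):
    """Standard inverted dropout: zero each unit independently with
    probability p and rescale the survivors by 1/(1-p), so the expected
    value of the output matches the input."""
    if not training or p == 0.0:
        return x, None
    rng = np.random.default_rng() if rng is None else rng
    mask = (rng.random(x.shape) >= p).astype(x.dtype) / (1.0 - p)
    return x * mask, mask

# Example: dropout applied to a small batch of hidden activations.
h = np.random.default_rng(0).standard_normal((4, 8))
h_drop, mask = dropout_forward(h, p=0.5, rng=np.random.default_rng(1))
print(h_drop.shape)  # (4, 8); roughly half the entries are zeroed
```

At test time (training=False) the input passes through unchanged; the 1/(1-p) rescaling during training is what makes that convention consistent in expectation.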

Papers citing "Dropout: Explicit Forms and Capacity Control" (13 of 13 papers shown)

  • Analytic theory of dropout regularization
    Francesco Mori, Francesca Mignacco · 12 May 2025 · 0 citations

  • Hadamard product in deep learning: Introduction, Advances and Challenges
    Grigorios G. Chrysos, Yongtao Wu, Razvan Pascanu, Philip Torr, V. Cevher · 17 Apr 2025 · 0 citations · AAML

  • UMIX: Improving Importance Weighting for Subpopulation Shift via Uncertainty-Aware Mixup
    Zongbo Han, Zhipeng Liang, Fan Yang, Liu Liu, Lanqing Li, Yatao Bian, P. Zhao, Bing Wu, Changqing Zhang, Jianhua Yao · 19 Sep 2022 · 34 citations

  • A Unified Analysis of Mixed Sample Data Augmentation: A Loss Function Perspective
    Chanwoo Park, Sangdoo Yun, Sanghyuk Chun · 21 Aug 2022 · 32 citations · AAML

  • Gating Dropout: Communication-efficient Regularization for Sparsely Activated Transformers
    R. Liu, Young Jin Kim, Alexandre Muzio, Hany Awadalla · 28 May 2022 · 22 citations · MoE

  • Towards Size-Independent Generalization Bounds for Deep Operator Nets
    Pulkit Gopalani, Sayar Karmakar, Dibyakanti Kumar, Anirbit Mukherjee · 23 May 2022 · 5 citations · AI4CE

  • Stochastic Neural Networks with Infinite Width are Deterministic
    Liu Ziyin, Hanlin Zhang, Xiangming Meng, Yuting Lu, Eric P. Xing, Masakuni Ueda · 30 Jan 2022 · 3 citations

  • Explicit Regularisation in Gaussian Noise Injections
    A. Camuto, M. Willetts, Umut Simsekli, Stephen J. Roberts, Chris Holmes · 14 Jul 2020 · 55 citations

  • Shape Matters: Understanding the Implicit Bias of the Noise Covariance
    Jeff Z. HaoChen, Colin Wei, J. Lee, Tengyu Ma · 15 Jun 2020 · 93 citations

  • Implicit Regularization in Deep Learning May Not Be Explainable by Norms
    Noam Razin, Nadav Cohen · 13 May 2020 · 155 citations

  • Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
    Y. Gal, Zoubin Ghahramani · 06 Jun 2015 · 9,136 citations · UQCV, BDL

  • Norm-Based Capacity Control in Neural Networks
    Behnam Neyshabur, Ryota Tomioka, Nathan Srebro · 27 Feb 2015 · 577 citations

  • Improving neural networks by preventing co-adaptation of feature detectors
    Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov · 03 Jul 2012 · 7,634 citations · VLM