On the Inductive Bias of Dropout
D. Helmbold, Philip M. Long
15 December 2014
arXiv:1412.4736

Papers citing "On the Inductive Bias of Dropout"

  1. Modular Robot Control with Motor Primitives. Moses C. Nah, Johannes Lachner, Neville Hogan. 15 May 2025.
  2. Dropout Regularization in Extended Generalized Linear Models based on Double Exponential Families. Benedikt Lutke Schwienhorst, Lucas Kock, David J. Nott, Nadja Klein. 11 May 2023.
  3. Information Geometry of Dropout Training. Masanari Kimura, H. Hino. 22 Jun 2022.
  4. A Survey on Dropout Methods and Experimental Verification in Recommendation. Yongqian Li, Weizhi Ma, C. L. Philip Chen, M. Zhang, Yiqun Liu, Shaoping Ma, Yue Yang. 05 Apr 2022.
  5. Noise Regularizes Over-parameterized Rank One Matrix Recovery, Provably. Tianyi Liu, Yan Li, Enlu Zhou, Tuo Zhao. 07 Feb 2022.
  6. Explicit Regularisation in Gaussian Noise Injections. A. Camuto, M. Willetts, Umut Simsekli, Stephen J. Roberts, Chris Holmes. 14 Jul 2020.
  7. Dropout: Explicit Forms and Capacity Control. R. Arora, Peter L. Bartlett, Poorya Mianjy, Nathan Srebro. 06 Mar 2020.
  8. Survey of Dropout Methods for Deep Neural Networks. Alex Labach, Hojjat Salehinejad, S. Valaee. 25 Apr 2019.
  9. Implicit Regularization of Stochastic Gradient Descent in Natural Language Processing: Observations and Implications. Deren Lei, Zichen Sun, Yijun Xiao, William Yang Wang. 01 Nov 2018.
  10. An ETF view of Dropout regularization. Dor Bank, Raja Giryes. 14 Oct 2018.
  11. On the Implicit Bias of Dropout. Poorya Mianjy, R. Arora, René Vidal. 26 Jun 2018.
  12. Formal Guarantees on the Robustness of a Classifier against Adversarial Manipulation. Matthias Hein, Maksym Andriushchenko. 23 May 2017.
  13. Improved Dropout for Shallow and Deep Learning. Zhe Li, Boqing Gong, Tianbao Yang. 06 Feb 2016.
  14. A Scale Mixture Perspective of Multiplicative Noise in Neural Networks. Eric T. Nalisnick, Anima Anandkumar, Padhraic Smyth. 10 Jun 2015.
  15. Improving neural networks by preventing co-adaptation of feature detectors. Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov. 03 Jul 2012.