ResearchTrend.AI

Dropout Training as Adaptive Regularization
Stefan Wager, Sida I. Wang, Percy Liang
4 July 2013 · arXiv:1307.1493

Papers citing "Dropout Training as Adaptive Regularization"

6 of 6 papers shown:

  1. Random Forest Autoencoders for Guided Representation Learning
     Adrien Aumon, Shuang Ni, Myriam Lizotte, Guy Wolf, Kevin R. Moon, Jake S. Rhodes
     18 Feb 2025

  2. Beyond Self-Consistency: Loss-Balanced Perturbation-Based Regularization Improves Industrial-Scale Ads Ranking
     Ilqar Ramazanli, Hamid Eghbalzadeh, Xiaoyi Liu, Yang Wang, Jiaxiang Fu, Kaushik Rangadurai, Sem Park, Bo Long, Xue Feng
     05 Feb 2025

  3. A Review of Bayesian Uncertainty Quantification in Deep Probabilistic Image Segmentation (UQCV)
     M. Valiuddin, R. V. Sloun, C.G.A. Viviers, Peter H. N. de With, Fons van der Sommen
     25 Nov 2024

  4. Maxout Networks (OOD)
     Ian Goodfellow, David Warde-Farley, M. Berk Mirza, Aaron Courville, Yoshua Bengio
     18 Feb 2013

  5. Improving neural networks by preventing co-adaptation of feature detectors (VLM)
     Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov
     03 Jul 2012

  6. Adding noise to the input of a model trained with a regularized objective (NoLa)
     Salah Rifai, Xavier Glorot, Yoshua Bengio, Pascal Vincent
     16 Apr 2011