arXiv: 1311.0701
On Fast Dropout and its Applicability to Recurrent Networks
Justin Bayer, Christian Osendorfer, Daniela Korhammer, Nutan Chen, Sebastian Urban, Patrick van der Smagt
4 November 2013 (ODL)
Papers citing "On Fast Dropout and its Applicability to Recurrent Networks"
9 / 9 papers shown
How to Construct Deep Recurrent Neural Networks
Razvan Pascanu, Çağlar Gülçehre, Kyunghyun Cho, Yoshua Bengio. 20 Dec 2013. 116 · 1,009 · 0

Generating Sequences With Recurrent Neural Networks
Alex Graves (GAN). 04 Aug 2013. 146 · 4,033 · 0

Dropout Training as Adaptive Regularization
Stefan Wager, Sida I. Wang, Percy Liang. 04 Jul 2013. 129 · 599 · 0

Speech Recognition with Deep Recurrent Neural Networks
Alex Graves, Abdel-rahman Mohamed, Geoffrey E. Hinton. 22 Mar 2013. 224 · 8,513 · 0

Regularization and nonlinearities for neural language models: when are they needed?
Marius Pachitariu, M. Sahani. 23 Jan 2013. 78 · 46 · 0

High-dimensional sequence transduction
Nicolas Boulanger-Lewandowski, Yoshua Bengio, Pascal Vincent (AI4TS). 09 Dec 2012. 182 · 69 · 0

Advances in Optimizing Recurrent Networks
Yoshua Bengio, Nicolas Boulanger-Lewandowski, Razvan Pascanu (ODL). 04 Dec 2012. 102 · 522 · 0

On the difficulty of training Recurrent Neural Networks
Razvan Pascanu, Tomas Mikolov, Yoshua Bengio (ODL). 21 Nov 2012. 190 · 5,342 · 0

Improving neural networks by preventing co-adaptation of feature detectors
Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov (VLM). 03 Jul 2012. 447 · 7,661 · 0