Capacity and Trainability in Recurrent Neural Networks
Jasmine Collins, Jascha Narain Sohl-Dickstein, David Sussillo
arXiv:1611.09913 · 29 November 2016

Papers citing "Capacity and Trainability in Recurrent Neural Networks"
Showing 45 of 95 citing papers.
Deep Independently Recurrent Neural Network (IndRNN)
Shuai Li, Wanqing Li, Chris Cook, Yanbo Gao · 11 Oct 2019

The Ant Swarm Neuro-Evolution Procedure for Optimizing Recurrent Networks
A. ElSaid, Alexander Ororbia, Travis J. Desell · 26 Sep 2019 · ODL

An Empirical Exploration of Deep Recurrent Connections and Memory Cells Using Neuro-Evolution
Travis J. Desell, A. ElSaid, Alexander Ororbia · 20 Sep 2019

Recurrent Neural Networks for Time Series Forecasting: Current Status and Future Directions
Hansika Hewamalage, Christoph Bergmeir, Kasun Bandara · 02 Sep 2019 · AI4TS

RNNs Evolving on an Equilibrium Manifold: A Panacea for Vanishing and Exploding Gradients?
Anil Kag, Ziming Zhang, Venkatesh Saligrama · 22 Aug 2019
Deep Temporal Analysis for Non-Acted Body Affect Recognition
D. Avola, Luigi Cinque, Alessio Fagioli, G. Foresti, Cristiano Massaroni · 23 Jul 2019 · CVBM

Universality and individuality in neural dynamics across large populations of recurrent networks
Niru Maheswaranathan, Alex H. Williams, Matthew D. Golub, Surya Ganguli, David Sussillo · 19 Jul 2019

Reverse engineering recurrent networks for sentiment classification reveals line attractor dynamics
Niru Maheswaranathan, Alex H. Williams, Matthew D. Golub, Surya Ganguli, David Sussillo · 25 Jun 2019

Multigrid Neural Memory
T. Huynh, Michael Maire, Matthew R. Walter · 13 Jun 2019

On Network Design Spaces for Visual Recognition
Ilija Radosavovic, Justin Johnson, Saining Xie, Wan-Yen Lo, Piotr Dollár · 30 May 2019
Clustering and Recognition of Spatiotemporal Features through Interpretable Embedding of Sequence to Sequence Recurrent Neural Networks
Kun Su, Eli Shlizerman · 29 May 2019

Learning Longer-term Dependencies via Grouped Distributor Unit
Wei Luo, Feng Yu · 29 Apr 2019

Contextual Hybrid Session-based News Recommendation with Recurrent Neural Networks
Gabriel de Souza P. Moreira, Dietmar Jannach, A. Cunha · 15 Apr 2019

The importance of better models in stochastic optimization
Hilal Asi, John C. Duchi · 20 Mar 2019

Understanding Feature Selection and Feature Memorization in Recurrent Neural Networks
Bokang Zhu, Richong Zhang, Dingkun Long, Yongyi Mao · 03 Mar 2019

Equilibrated Recurrent Neural Network: Neuronal Time-Delayed Self-Feedback Improves Accuracy and Stability
Ziming Zhang, Anil Kag, Alan Sullivan, Venkatesh Saligrama · 02 Mar 2019
AntisymmetricRNN: A Dynamical System View on Recurrent Neural Networks
B. Chang, Minmin Chen, E. Haber, Ed H. Chi · 26 Feb 2019 · PINN, GNN

An Optimized Recurrent Unit for Ultra-Low-Power Keyword Spotting
Justice Amoh, K. Odame · 13 Feb 2019

Investigating Recurrent Neural Network Memory Structures using Neuro-Evolution
Alexander Ororbia, A. ElSaid, Travis J. Desell · 06 Feb 2019

FastGRNN: A Fast, Accurate, Stable and Tiny Kilobyte Sized Gated Recurrent Neural Network
Aditya Kusupati, Manish Singh, Kush S. Bhatia, A. Kumar, Prateek Jain, Manik Varma · 08 Jan 2019

AdaFrame: Adaptive Frame Selection for Fast Video Recognition
Zuxuan Wu, Caiming Xiong, Chih-Yao Ma, R. Socher, L. Davis · 29 Nov 2018
Evaluating the Ability of LSTMs to Learn Context-Free Grammars
Luzi Sennhauser, Robert C. Berwick · 06 Nov 2018

Persistence pays off: Paying Attention to What the LSTM Gating Mechanism Persists
Giancarlo D. Salton, John D. Kelleher · 10 Oct 2018 · KELM, RALM

Deep, Skinny Neural Networks are not Universal Approximators
Jesse Johnson · 30 Sep 2018

Single-Microphone Speech Enhancement and Separation Using Deep Learning
Morten Kolbaek · 31 Aug 2018

Rational Recurrences
Hao Peng, Roy Schwartz, Sam Thomson, Noah A. Smith · 28 Aug 2018 · AI4CE

On Training Recurrent Networks with Truncated Backpropagation Through Time in Speech Recognition
Hao Tang, James R. Glass · 09 Jul 2018
Task-Driven Convolutional Recurrent Models of the Visual System
Aran Nayebi, Daniel M. Bear, J. Kubilius, Kohitij Kar, Surya Ganguli, David Sussillo, J. DiCarlo, Daniel L. K. Yamins · 20 Jun 2018

Towards an efficient deep learning model for musical onset detection
Rong Gong, Xavier Serra · 18 Jun 2018

Dynamical Isometry and a Mean Field Theory of RNNs: Gating Enables Signal Propagation in Recurrent Neural Networks
Minmin Chen, Jeffrey Pennington, S. Schoenholz · 14 Jun 2018 · SyDa, AI4CE

A Taxonomy for Neural Memory Networks
Ying Ma, José C. Príncipe · 01 May 2018

A Dataset and Architecture for Visual Reasoning with a Working Memory
G. R. Yang, Igor Ganichev, Xiao-Jing Wang, Jonathon Shlens, David Sussillo · 16 Mar 2018
Efficient Neural Architecture Search via Parameter Sharing
Hieu H. Pham, M. Guan, Barret Zoph, Quoc V. Le, J. Dean · 09 Feb 2018

Overcoming the vanishing gradient problem in plain recurrent networks
Yuhuang Hu, Adrian E. G. Huber, Jithendar Anumula, Shih-Chii Liu · 18 Jan 2018 · GNN

Recurrent Neural Networks for Semantic Instance Segmentation
Amaia Salvador, Míriam Bellver, Victor Campos, Manel Baradad, F. Marqués, Jordi Torres, Xavier Giró-i-Nieto · 02 Dec 2017 · SSeg

Deep Learning Scaling is Predictable, Empirically
Joel Hestness, Sharan Narang, Newsha Ardalani, G. Diamos, Heewoo Jun, Hassan Kianinejad, Md. Mostofa Ali Patwary, Yang Yang, Yanqi Zhou · 01 Dec 2017
Recurrent Segmentation for Variable Computational Budgets
Lane T. McIntosh, Niru Maheswaranathan, David Sussillo, Jonathon Shlens · 28 Nov 2017 · SSeg, VOS

MinimalRNN: Toward More Interpretable and Trainable Recurrent Neural Networks
Minmin Chen · 18 Nov 2017

One Model to Rule them all: Multitask and Multilingual Modelling for Lexical Analysis
Johannes Bjerva · 03 Nov 2017

Lattice Recurrent Unit: Improving Convergence and Statistical Efficiency for Sequence Modeling
Chaitanya Ahuja, Louis-Philippe Morency · 06 Oct 2017

Shifting Mean Activation Towards Zero with Bipolar Activation Functions
L. Eidnes, Arild Nøkland · 12 Sep 2017

Dual Rectified Linear Units (DReLUs): A Replacement for Tanh Activation Functions in Quasi-Recurrent Neural Networks
Fréderic Godin, Jonas Degrave, J. Dambre, W. D. Neve · 25 Jul 2017 · MU
On the State of the Art of Evaluation in Neural Language Models
Gábor Melis, Chris Dyer, Phil Blunsom · 18 Jul 2017

Efficient Orthogonal Parametrisation of Recurrent Neural Networks Using Householder Reflections
Zakaria Mhammedi, Andrew D. Hellicar, Ashfaqur Rahman, James Bailey · 01 Dec 2016

Input Switched Affine Networks: An RNN Architecture Designed for Interpretability
Jakob N. Foerster, Justin Gilmer, J. Chorowski, Jascha Narain Sohl-Dickstein, David Sussillo · 28 Nov 2016