Memory-Augmented Recurrent Neural Networks Can Learn Generalized Dyck Languages
Mirac Suzgun, Sebastian Gehrmann, Yonatan Belinkov, Stuart M. Shieber
arXiv:1911.03329, 8 November 2019
Papers citing "Memory-Augmented Recurrent Neural Networks Can Learn Generalized Dyck Languages" (8 papers):
- Training Neural Networks as Recognizers of Formal Languages. Alexandra Butoi, Ghazal Khalighinejad, Anej Svete, Josef Valvoda, Ryan Cotterell, Brian DuSell. 11 Nov 2024.
- Separations in the Representational Capabilities of Transformers and Recurrent Architectures. S. Bhattamishra, Michael Hahn, Phil Blunsom, Varun Kanade. 13 Jun 2024.
- Learning Universal Predictors. Jordi Grau-Moya, Tim Genewein, Marcus Hutter, Laurent Orseau, Grégoire Delétang, Anian Ruoss, Wenliang Kevin Li, Christopher Mattern, Matthew Aitchison, J. Veness. 26 Jan 2024.
- Theoretical Conditions and Empirical Failure of Bracket Counting on Long Sequences with Linear Recurrent Networks. Nadine El-Naggar, Pranava Madhyastha, Tillman Weyde. 07 Apr 2023.
- Minimum Description Length Recurrent Neural Networks. N. Lan, Michal Geyer, Emmanuel Chemla, Roni Katzir. 31 Oct 2021.
- Causal Transformers Perform Below Chance on Recursive Nested Constructions, Unlike Humans. Yair Lakretz, T. Desbordes, Dieuwke Hupkes, S. Dehaene. 14 Oct 2021.
- Formal Language Theory Meets Modern NLP. William Merrill. 19 Feb 2021.
- Can RNNs learn Recursive Nested Subject-Verb Agreements? Yair Lakretz, T. Desbordes, J. King, Benoît Crabbé, Maxime Oquab, S. Dehaene. 06 Jan 2021.