ResearchTrend.AI
arXiv:1809.02836
Context-Free Transductions with Neural Stacks

8 September 2018
Sophie Hao, William Merrill, Dana Angluin, Robert Frank, Noah Amsel, Andrew Benz, S. Mendelsohn

Papers citing "Context-Free Transductions with Neural Stacks"

15 citing papers shown.
On the Representational Capacity of Neural Language Models with Chain-of-Thought Reasoning
Franz Nowak, Anej Svete, Alexandra Butoi, Ryan Cotterell
20 Jun 2024

What Languages are Easy to Language-Model? A Perspective from Learning Probabilistic Regular Languages
Nadav Borenstein, Anej Svete, R. Chan, Josef Valvoda, Franz Nowak, Isabelle Augenstein, Eleanor Chodroff, Ryan Cotterell
06 Jun 2024

Enhancing Length Extrapolation in Sequential Models with Pointer-Augmented Neural Memory
Hung Le, D. Nguyen, Kien Do, Svetha Venkatesh, T. Tran
18 Apr 2024

Recurrent Neural Language Models as Probabilistic Finite-state Automata
Anej Svete, Ryan Cotterell
08 Oct 2023

State-Regularized Recurrent Neural Networks to Extract Automata and Explain Predictions
Cheng Wang, Carolin (Haas) Lawrence, Mathias Niepert
10 Dec 2022

Neural Networks and the Chomsky Hierarchy
Grégoire Delétang, Anian Ruoss, Jordi Grau-Moya, Tim Genewein, L. Wenliang, ..., Chris Cundy, Marcus Hutter, Shane Legg, Joel Veness, Pedro A. Ortega
05 Jul 2022

Formal Language Theory Meets Modern NLP
William Merrill
19 Feb 2021

RNNs can generate bounded hierarchical languages with optimal memory
John Hewitt, Michael Hahn, Surya Ganguli, Percy Liang, Christopher D. Manning
15 Oct 2020

A provably stable neural network Turing Machine
J. Stogin, A. Mali, L. Giles
05 Jun 2020

Memory-Augmented Recurrent Neural Networks Can Learn Generalized Dyck Languages
Mirac Suzgun, Sebastian Gehrmann, Yonatan Belinkov, Stuart M. Shieber
08 Nov 2019

LSTM Networks Can Perform Dynamic Counting
Mirac Suzgun, Sebastian Gehrmann, Yonatan Belinkov, Stuart M. Shieber
09 Jun 2019

Sequential Neural Networks as Automata
William Merrill
04 Jun 2019

Finding Syntactic Representations in Neural Stacks
William Merrill, Lenny Khazan, Noah Amsel, Sophie Hao, S. Mendelsohn, Robert Frank
04 Jun 2019

Analyzing and Interpreting Neural Networks for NLP: A Report on the First BlackboxNLP Workshop
Afra Alishahi, Grzegorz Chrupała, Tal Linzen
05 Apr 2019

State-Regularized Recurrent Neural Networks
Cheng Wang, Mathias Niepert
25 Jan 2019