arXiv:1706.03542
Exploring the Syntactic Abilities of RNNs with Multi-task Learning
Émile Enguehard, Yoav Goldberg, Tal Linzen
12 June 2017
Papers citing "Exploring the Syntactic Abilities of RNNs with Multi-task Learning" (9 papers shown):
Syntactic Surprisal From Neural Models Predicts, But Underestimates, Human Processing Difficulty From Syntactic Ambiguities
Suhas Arehalli, Brian Dillon, Tal Linzen. 21 Oct 2022.

Does syntax need to grow on trees? Sources of hierarchical inductive bias in sequence-to-sequence networks
R. Thomas McCoy, Robert Frank, Tal Linzen. 10 Jan 2020.

Using Priming to Uncover the Organization of Syntactic Representations in Neural Language Models
Grusha Prasad, Marten van Schijndel, Tal Linzen. 23 Sep 2019.

What Should/Do/Can LSTMs Learn When Parsing Auxiliary Verb Constructions?
Miryam de Lhoneux, Sara Stymne, Joakim Nivre. 18 Jul 2019.

Scalable Syntax-Aware Language Models Using Knowledge Distillation
A. Kuncoro, Chris Dyer, Laura Rimell, S. Clark, Phil Blunsom. 14 Jun 2019.

Neural Language Models as Psycholinguistic Subjects: Representations of Syntactic State
Richard Futrell, Ethan Gotlieb Wilcox, Takashi Morita, Peng Qian, Miguel Ballesteros, R. Levy. 08 Mar 2019.

Targeted Syntactic Evaluation of Language Models
Rebecca Marvin, Tal Linzen. 27 Aug 2018.

Are All Languages Equally Hard to Language-Model?
Ryan Cotterell, Sabrina J. Mielke, Jason Eisner, Brian Roark. 10 Jun 2018.

Colorless green recurrent networks dream hierarchically
Kristina Gulordava, Piotr Bojanowski, Edouard Grave, Tal Linzen, Marco Baroni. 29 Mar 2018.