Discourse structure interacts with reference but not syntax in neural language models
Forrest Davis, Marten van Schijndel
arXiv:2010.04887 · 10 October 2020

Papers citing "Discourse structure interacts with reference but not syntax in neural language models"
12 papers shown

1. Linguistic Interpretability of Transformer-based Language Models: a systematic review
   Miguel López-Otal, Jorge Gracia, Jordi Bernad, Carlos Bobed, Lucía Pitarch-Ballesteros, Emma Anglés-Herrero
   VLM · 1 citation · 09 Apr 2025

2. A Systematic Assessment of Syntactic Generalization in Neural Language Models
   Jennifer Hu, Jon Gauthier, Peng Qian, Ethan Gotlieb Wilcox, R. Levy
   ELM · 220 citations · 07 May 2020

3. Using Priming to Uncover the Organization of Syntactic Representations in Neural Language Models
   Grusha Prasad, Marten van Schijndel, Tal Linzen
   52 citations · 23 Sep 2019

4. Analysing Neural Language Models: Contextual Decomposition Reveals Default Reasoning in Number and Gender Assignment
   Jaap Jumelet, Willem H. Zuidema, Dieuwke Hupkes
   LRM · 37 citations · 19 Sep 2019

5. Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations
   Ethan Gotlieb Wilcox, R. Levy, Richard Futrell
   MILM · 30 citations · 10 Jun 2019

6. What Syntactic Structures block Dependencies in RNN Language Models?
   Ethan Gotlieb Wilcox, R. Levy, Richard Futrell
   24 citations · 24 May 2019

7. Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context
   Zihang Dai, Zhilin Yang, Yiming Yang, J. Carbonell, Quoc V. Le, Ruslan Salakhutdinov
   VLM · 3,726 citations · 09 Jan 2019

8. RNNs as psycholinguistic subjects: Syntactic state and grammatical dependency
   Richard Futrell, Ethan Gotlieb Wilcox, Takashi Morita, R. Levy
   57 citations · 05 Sep 2018

9. Colorless green recurrent networks dream hierarchically
   Kristina Gulordava, Piotr Bojanowski, Edouard Grave, Tal Linzen, Marco Baroni
   504 citations · 29 Mar 2018

10. Deep contextualized word representations
    Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, Luke Zettlemoyer
    NAI · 11,549 citations · 15 Feb 2018

11. Exploring the Syntactic Abilities of RNNs with Multi-task Learning
    Émile Enguehard, Yoav Goldberg, Tal Linzen
    28 citations · 12 Jun 2017

12. Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies
    Tal Linzen, Emmanuel Dupoux, Yoav Goldberg
    903 citations · 04 Nov 2016