Causal Transformers Perform Below Chance on Recursive Nested Constructions, Unlike Humans
Yair Lakretz, T. Desbordes, Dieuwke Hupkes, S. Dehaene
arXiv:2110.07240, 14 October 2021
Papers citing "Causal Transformers Perform Below Chance on Recursive Nested Constructions, Unlike Humans" (8 papers shown):

1. Can Transformers Do Enumerative Geometry?
   Baran Hashemi, Roderic G. Corominas, Alessandro Giacchetto (27 Aug 2024)

2. Language acquisition: do children and language models follow similar learning stages?
   Linnea Evanson, Yair Lakretz, J. King (06 Jun 2023)

3. Dissociating language and thought in large language models
   Kyle Mahowald, Anna A. Ivanova, I. Blank, Nancy Kanwisher, J. Tenenbaum, Evelina Fedorenko (16 Jan 2023)

4. Probing for Incremental Parse States in Autoregressive Language Models
   Tiwalayo Eisape, Vineet Gangireddy, R. Levy, Yoon Kim (17 Nov 2022)

5. State-of-the-art generalisation research in NLP: A taxonomy and review
   Dieuwke Hupkes, Mario Giulianelli, Verna Dankers, Mikel Artetxe, Yanai Elazar, ..., Leila Khalatbari, Maria Ryskina, Rita Frieske, Ryan Cotterell, Zhijing Jin (06 Oct 2022)

6. Using cognitive psychology to understand GPT-3
   Marcel Binz, Eric Schulz (21 Jun 2022)

7. Transformers Generalize Linearly
   Jackson Petty, Robert Frank (24 Sep 2021)

8. Can RNNs learn Recursive Nested Subject-Verb Agreements?
   Yair Lakretz, T. Desbordes, J. King, Benoît Crabbé, Maxime Oquab, S. Dehaene (06 Jan 2021)