arXiv: 2310.03817
Logical Languages Accepted by Transformer Encoders with Hard Attention
5 October 2023
Pablo Barceló, Alexander Kozachinskiy, Anthony Widjaja Lin, Vladimir Podolskii
Papers citing "Logical Languages Accepted by Transformer Encoders with Hard Attention" (6 papers)
Rosetta-PL: Propositional Logic as a Benchmark for Large Language Model Reasoning
Shaun Baek, Shaun Esua-Mensah, Cyrus Tsui, Sejan Vigneswaralingam, Abdullah Alali, Michael Lu, Vasu Sharma, Sean O'Brien, Kevin Zhu
25 Mar 2025

Lower Bounds for Chain-of-Thought Reasoning in Hard-Attention Transformers
Alireza Amiri, Xinting Huang, Mark Rofin, Michael Hahn
04 Feb 2025

Tighter Bounds on the Expressivity of Transformer Encoders
David Chiang, Peter A. Cholak, A. Pillay
25 Jan 2023

Formal Language Recognition by Hard Attention Transformers: Perspectives from Circuit Complexity
Sophie Hao, Dana Angluin, Robert Frank
13 Apr 2022

Overcoming a Theoretical Limitation of Self-Attention
David Chiang, Peter A. Cholak
24 Feb 2022

Theoretical Limitations of Self-Attention in Neural Sequence Models
Michael Hahn
16 Jun 2019