Transformers are Minimax Optimal Nonparametric In-Context Learners
Juno Kim, Tai Nakamaki, Taiji Suzuki
arXiv:2408.12186, 22 August 2024
Papers citing "Transformers are Minimax Optimal Nonparametric In-Context Learners" (7 papers)
When Do Transformers Outperform Feedforward and Recurrent Networks? A Statistical Perspective
Alireza Mousavi-Hosseini, Clayton Sanford, Denny Wu, Murat A. Erdogdu
14 Mar 2025
Pretrained transformer efficiently learns low-dimensional target functions in-context
Kazusato Oko, Yujin Song, Taiji Suzuki, Denny Wu
04 Nov 2024
Provable optimal transport with transformers: The essence of depth and prompt engineering
Hadi Daneshmand
25 Oct 2024
Active-Dormant Attention Heads: Mechanistically Demystifying Extreme-Token Phenomena in LLMs
Tianyu Guo, Druv Pai, Yu Bai, Jiantao Jiao, Michael I. Jordan, Song Mei
17 Oct 2024
Large Language Models as Markov Chains
Oussama Zekri, Ambroise Odonnat, Abdelhakim Benechehab, Linus Bleistein, Nicolas Boullé, I. Redko
03 Oct 2024
Trained Transformer Classifiers Generalize and Exhibit Benign Overfitting In-Context
Spencer Frei, Gal Vardi
02 Oct 2024
Provable In-Context Learning of Linear Systems and Linear Elliptic PDEs with Transformers
Frank Cole, Yulong Lu, Riley O'Neill, Tianhao Zhang
18 Sep 2024