arXiv: 2402.11639 (v2, latest)
In-Context Learning with Transformers: Softmax Attention Adapts to Function Lipschitzness
18 February 2024
Liam Collins, Advait Parulekar, Aryan Mokhtari, Sujay Sanghavi, Sanjay Shakkottai
Community: MLT
Links: arXiv (abs) · PDF · HTML
Papers citing "In-Context Learning with Transformers: Softmax Attention Adapts to Function Lipschitzness" (2 of 2 papers shown)
When and How Unlabeled Data Provably Improve In-Context Learning
Yingcong Li, Xiangyu Chang, Muti Kara, Xiaofeng Liu, Amit K. Roy-Chowdhury, Samet Oymak
18 Jun 2025

Transformers are Provably Optimal In-context Estimators for Wireless Communications
Vishnu Teja Kunde, Vicram Rajagopalan, Chandra Shekhara Kaushik Valmeekam, Krishna R. Narayanan, S. Shakkottai, D. Kalathil, J. Chamberland
01 Nov 2023