
In-Context Learning with Transformers: Softmax Attention Adapts to Function Lipschitzness

18 February 2024
Liam Collins, Advait Parulekar, Aryan Mokhtari, Sujay Sanghavi, Sanjay Shakkottai
    MLT
ArXiv (abs) · PDF · HTML

Papers citing "In-Context Learning with Transformers: Softmax Attention Adapts to Function Lipschitzness"

2 / 2 papers shown
When and How Unlabeled Data Provably Improve In-Context Learning
Yingcong Li, Xiangyu Chang, Muti Kara, Xiaofeng Liu, Amit K. Roy-Chowdhury, Samet Oymak
18 Jun 2025

Transformers are Provably Optimal In-context Estimators for Wireless Communications
Vishnu Teja Kunde, Vicram Rajagopalan, Chandra Shekhara Kaushik Valmeekam, Krishna R. Narayanan, S. Shakkottai, D. Kalathil, J. Chamberland
01 Nov 2023