arXiv: 2505.21535 (v2, latest)
Is Attention Required for Transformer Inference? Explore Function-preserving Attention Replacement
24 May 2025
Yuxin Ren, Maxwell D. Collins, Miao Hu, Huanrui Yang
Papers citing "Is Attention Required for Transformer Inference? Explore Function-preserving Attention Replacement"
On the Relationship between Self-Attention and Convolutional Layers
Jean-Baptiste Cordonnier, Andreas Loukas, Martin Jaggi
08 Nov 2019