arXiv:2312.01324
MABViT -- Modified Attention Block Enhances Vision Transformers
3 December 2023
Mahesh Ramesh, Aswinkumar Ramkumar
Papers citing "MABViT -- Modified Attention Block Enhances Vision Transformers" (3 of 3 shown)
1. Activator: GLU Activation Function as the Core Component of a Vision Transformer
   Abdullah Nazhat Abdullah, Tarkan Aydin
   24 May 2024

2. ResiDual: Transformer with Dual Residual Connections
   Shufang Xie, Huishuai Zhang, Junliang Guo, Xu Tan, Jiang Bian, Hany Awadalla, Arul Menezes, Tao Qin, Rui Yan
   28 Apr 2023

3. Talking-Heads Attention
   Noam M. Shazeer, Zhenzhong Lan, Youlong Cheng, Nan Ding, L. Hou
   05 Mar 2020