MABViT -- Modified Attention Block Enhances Vision Transformers

3 December 2023
Mahesh Ramesh, Aswinkumar Ramkumar

Papers citing "MABViT -- Modified Attention Block Enhances Vision Transformers"

3 of 3 papers shown

Activator: GLU Activation Function as the Core Component of a Vision Transformer
Abdullah Nazhat Abdullah, Tarkan Aydin
Topics: ViT
24 May 2024

ResiDual: Transformer with Dual Residual Connections
Shufang Xie, Huishuai Zhang, Junliang Guo, Xu Tan, Jiang Bian, Hany Awadalla, Arul Menezes, Tao Qin, Rui Yan
28 Apr 2023

Talking-Heads Attention
Noam M. Shazeer, Zhenzhong Lan, Youlong Cheng, Nan Ding, L. Hou
5 Mar 2020