OmniNet: Omnidirectional Representations from Transformers
arXiv 2103.01075 · 1 March 2021
Yi Tay, Mostafa Dehghani, V. Aribandi, Jai Gupta, Philip Pham, Zhen Qin, Dara Bahri, Da-Cheng Juan, Donald Metzler

Papers citing "OmniNet: Omnidirectional Representations from Transformers" (10 of 10 papers shown)

MUDDFormer: Breaking Residual Bottlenecks in Transformers via Multiway Dynamic Dense Connections
Da Xiao, Qingye Meng, Shengping Li, Xingyuan Yuan
13 Feb 2025 · MoE, AI4CE

CUPID: Improving Battle Fairness and Position Satisfaction in Online MOBA Games with a Re-matchmaking System
Ge Fan, Chaoyun Zhang, Kai Wang, Yingjie Li, Junyang Chen, Zenglin Xu
28 Jun 2024

Cached Transformers: Improving Transformers with Differentiable Memory Cache
Zhaoyang Zhang, Wenqi Shao, Yixiao Ge, Xiaogang Wang, Liang Feng, Ping Luo
20 Dec 2023

Adaptive Cross-Layer Attention for Image Restoration
Yancheng Wang, N. Xu, Yingzhen Yang
04 Mar 2022

ViNMT: Neural Machine Translation Toolkit
Nguyen Hoang Quan, N. T. Dat, Nguyen Hoang Minh Cong, Nguyen Van Vinh, Ngo Thi Vinh, N. Thai, T. Viet
31 Dec 2021

The Efficiency Misnomer
Mostafa Dehghani, Anurag Arnab, Lucas Beyer, Ashish Vaswani, Yi Tay
25 Oct 2021

Exploring the Limits of Large Scale Pre-training
Samira Abnar, Mostafa Dehghani, Behnam Neyshabur, Hanie Sedghi
05 Oct 2021 · AI4CE

KVT: k-NN Attention for Boosting Vision Transformers
Pichao Wang, Xue Wang, F. Wang, Ming Lin, Shuning Chang, Hao Li, R. L. Jin
28 May 2021 · ViT

Big Bird: Transformers for Longer Sequences
Manzil Zaheer, Guru Guruganesh, Kumar Avinava Dubey, Joshua Ainslie, Chris Alberti, ..., Philip Pham, Anirudh Ravula, Qifan Wang, Li Yang, Amr Ahmed
28 Jul 2020 · VLM

A Decomposable Attention Model for Natural Language Inference
Ankur P. Parikh, Oscar Täckström, Dipanjan Das, Jakob Uszkoreit
06 Jun 2016