Horizontal and Vertical Attention in Transformers

Topic: ViT

Papers citing "Horizontal and Vertical Attention in Transformers"

Linformer: Self-Attention with Linear Complexity
Sinong Wang, Belinda Z. Li, Madian Khabsa, Han Fang, Hao Ma
08 Jun 2020

Edinburgh Neural Machine Translation Systems for WMT 16
Rico Sennrich, Barry Haddow, Alexandra Birch
09 Jun 2016
