Reformer: The Efficient Transformer
arXiv:2001.04451, 13 January 2020
Nikita Kitaev, Lukasz Kaiser, Anselm Levskaya

Papers citing "Reformer: The Efficient Transformer" (5 of 505 papers shown)

Sparse Sinkhorn Attention
Yi Tay, Dara Bahri, Liu Yang, Donald Metzler, Da-Cheng Juan
26 Feb 2020

Fixed Encoder Self-Attention Patterns in Transformer-Based Machine Translation
Alessandro Raganato, Yves Scherrer, Jörg Tiedemann
24 Feb 2020

Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows
Kashif Rasul, Abdul-Saboor Sheikh, Ingmar Schuster, Urs M. Bergmann, Roland Vollgraf
14 Feb 2020

Blockwise Self-Attention for Long Document Understanding
J. Qiu, Hao Ma, Omer Levy, Scott Yih, Sinong Wang, Jie Tang
07 Nov 2019

Faster Neural Network Training with Approximate Tensor Operations
Menachem Adelman, Kfir Y. Levy, Ido Hakimi, M. Silberstein
21 May 2018