Reformer: The Efficient Transformer

13 January 2020
Nikita Kitaev, Lukasz Kaiser, Anselm Levskaya
Tags: VLM

Papers citing "Reformer: The Efficient Transformer"

Showing 5 of 505 papers.

Sparse Sinkhorn Attention
Yi Tay, Dara Bahri, Liu Yang, Donald Metzler, Da-Cheng Juan
26 Feb 2020

Fixed Encoder Self-Attention Patterns in Transformer-Based Machine Translation
Alessandro Raganato, Yves Scherrer, Jörg Tiedemann
24 Feb 2020

Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows
Kashif Rasul, Abdul-Saboor Sheikh, Ingmar Schuster, Urs M. Bergmann, Roland Vollgraf
Tags: BDL, AI4TS, AI4CE
14 Feb 2020

Blockwise Self-Attention for Long Document Understanding
J. Qiu, Hao Ma, Omer Levy, Scott Yih, Sinong Wang, Jie Tang
07 Nov 2019

Faster Neural Network Training with Approximate Tensor Operations
Menachem Adelman, Kfir Y. Levy, Ido Hakimi, M. Silberstein
21 May 2018