arXiv:2205.01546 · Cited By
Learn To Remember: Transformer with Recurrent Memory for Document-Level Machine Translation
3 May 2022 · Yukun Feng, Feng Li, Ziang Song, Boyuan Zheng, Philipp Koehn
Papers citing "Learn To Remember: Transformer with Recurrent Memory for Document-Level Machine Translation" (23 / 23 papers shown)
1. (Perhaps) Beyond Human Translation: Harnessing Multi-Agent Collaboration for Translating Ultra-Long Literary Texts
   Minghao Wu, Jiahao Xu, Yulin Yuan, Gholamreza Haffari, Longyue Wang, Weihua Luo, Kaifu Zhang · LLMAG · 171 · 27 · 0 · 20 May 2024

2. ∞-former: Infinite Memory Transformer
   Pedro Henrique Martins, Zita Marinho, André F. T. Martins · 91 · 11 · 0 · 01 Sep 2021

3. G-Transformer for Document-level Machine Translation
   Guangsheng Bao, Yue Zhang, Zhiyang Teng, Boxing Chen, Weihua Luo · 50 · 81 · 0 · 31 May 2021

4. Long-Short Term Masking Transformer: A Simple but Effective Baseline for Document-level Neural Machine Translation
   Pei Zhang, Boxing Chen, Niyu Ge, Kai Fan · 79 · 37 · 0 · 19 Sep 2020

5. Big Bird: Transformers for Longer Sequences
   Manzil Zaheer, Guru Guruganesh, Kumar Avinava Dubey, Joshua Ainslie, Chris Alberti, ..., Philip Pham, Anirudh Ravula, Qifan Wang, Li Yang, Amr Ahmed · VLM · 556 · 2,099 · 0 · 28 Jul 2020

6. Longformer: The Long-Document Transformer
   Iz Beltagy, Matthew E. Peters, Arman Cohan · RALM, VLM · 179 · 4,092 · 0 · 10 Apr 2020

7. Context-Aware Monolingual Repair for Neural Machine Translation
   Elena Voita, Rico Sennrich, Ivan Titov · 48 · 98 · 0 · 03 Sep 2019

8. Microsoft Translator at WMT 2019: Towards Large-Scale Document-Level Neural Machine Translation
   Marcin Junczys-Dowmunt · 62 · 159 · 0 · 14 Jul 2019

9. XLNet: Generalized Autoregressive Pretraining for Language Understanding
   Zhilin Yang, Zihang Dai, Yiming Yang, J. Carbonell, Ruslan Salakhutdinov, Quoc V. Le · AI4CE · 236 · 8,447 · 0 · 19 Jun 2019

10. Generating Long Sequences with Sparse Transformers
    R. Child, Scott Gray, Alec Radford, Ilya Sutskever · 129 · 1,915 · 0 · 23 Apr 2019

11. Selective Attention for Context-aware Neural Machine Translation
    Sameen Maruf, André F. T. Martins, Gholamreza Haffari · 69 · 176 · 0 · 21 Mar 2019

12. Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context
    Zihang Dai, Zhilin Yang, Yiming Yang, J. Carbonell, Quoc V. Le, Ruslan Salakhutdinov · VLM · 260 · 3,745 · 0 · 09 Jan 2019

13. Improving the Transformer Translation Model with Document-Level Context
    Jiacheng Zhang, Huanbo Luan, Maosong Sun, Feifei Zhai, Jingfang Xu, Min Zhang, Yang Liu · 86 · 255 · 0 · 08 Oct 2018

14. Has Machine Translation Achieved Human Parity? A Case for Document-level Evaluation
    Samuel Läubli, Rico Sennrich, M. Volk · 47 · 259 · 0 · 21 Aug 2018

15. Image Transformer
    Niki Parmar, Ashish Vaswani, Jakob Uszkoreit, Lukasz Kaiser, Noam M. Shazeer, Alexander Ku, Dustin Tran · ViT · 144 · 1,685 · 0 · 15 Feb 2018

16. Neural Machine Translation with Extended Context
    Jörg Tiedemann, Yves Scherrer · 56 · 252 · 0 · 20 Aug 2017

17. Memory-augmented Neural Machine Translation
    Yang Feng, Shiyue Zhang, Andi Zhang, Dong Wang, Andrew Abel · 74 · 59 · 0 · 07 Aug 2017

18. Six Challenges for Neural Machine Translation
    Philipp Koehn, Rebecca Knowles · AAML, AIMat · 377 · 1,225 · 0 · 12 Jun 2017

19. Axiomatic Attribution for Deep Networks
    Mukund Sundararajan, Ankur Taly, Qiqi Yan · OOD, FAtt · 193 · 6,018 · 0 · 04 Mar 2017

20. Neural Machine Translation of Rare Words with Subword Units
    Rico Sennrich, Barry Haddow, Alexandra Birch · 228 · 7,757 · 0 · 31 Aug 2015

21. Effective Approaches to Attention-based Neural Machine Translation
    Thang Luong, Hieu H. Pham, Christopher D. Manning · 413 · 7,969 · 0 · 17 Aug 2015

22. Overcoming the Curse of Sentence Length for Neural Machine Translation using Automatic Segmentation
    Jean Pouget-Abadie, Dzmitry Bahdanau, B. V. Merrienboer, Kyunghyun Cho, Yoshua Bengio · SSeg · 95 · 79 · 0 · 03 Sep 2014

23. Neural Machine Translation by Jointly Learning to Align and Translate
    Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio · AIMat · 578 · 27,327 · 0 · 01 Sep 2014