Character-Level Translation with Self-attention
Yingqiang Gao, Nikola I. Nikolov, Yuhuang Hu, Richard H. R. Hahnloser
arXiv: 2004.14788 (30 April 2020)

Papers citing "Character-Level Translation with Self-attention"
Deterministic Reversible Data Augmentation for Neural Machine Translation
Jiashu Yao, Heyan Huang, Zeming Liu, Yuhang Guo
21 Feb 2025
Character-level NMT and language similarity
Josef Jon, Ondrej Bojar
08 Aug 2023
TranSFormer: Slow-Fast Transformer for Machine Translation
Bei Li, Yi Jing, Xu Tan, Zhen Xing, Tong Xiao, Jingbo Zhu
26 May 2023
Language Model Tokenizers Introduce Unfairness Between Languages
Aleksandar Petrov, Emanuele La Malfa, Philip Torr, Adel Bibi
17 May 2023
The boundaries of meaning: a case study in neural machine translation
Yuri Balashov
02 Oct 2022
Between words and characters: A Brief History of Open-Vocabulary Modeling and Tokenization in NLP
Sabrina J. Mielke, Zaid Alyafeai, Elizabeth Salesky, Colin Raffel, Manan Dey, ..., Arun Raja, Chenglei Si, Wilson Y. Lee, Benoît Sagot, Samson Tan
20 Dec 2021
Why don't people use character-level machine translation?
Jindrich Libovický, Helmut Schmid, Alexander Fraser
15 Oct 2021
Neural Machine Translation: A Review of Methods, Resources, and Tools
Zhixing Tan, Shuo Wang, Zonghan Yang, Gang Chen, Xuancheng Huang, Maosong Sun, Yang Liu
31 Dec 2020
Can Sequence-to-Sequence Models Crack Substitution Ciphers?
Nada Aldarrab, Jonathan May
30 Dec 2020
Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism
Orhan Firat, Kyunghyun Cho, Yoshua Bengio
06 Jan 2016