Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation
Mozhdeh Gheini, Xiang Ren, Jonathan May
arXiv:2104.08771 · 18 April 2021 · LRM
Papers citing "Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation" (3 of 53 papers shown)
Title | Authors | Tags | Date
The Power of Scale for Parameter-Efficient Prompt Tuning | Brian Lester, Rami Al-Rfou, Noah Constant | VPVLM | 18 Apr 2021
WARP: Word-level Adversarial ReProgramming | Karen Hambardzumyan, Hrant Khachatrian, Jonathan May | AAML | 01 Jan 2021
Word Translation Without Parallel Data | Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou | | 11 Oct 2017