Lifting the Curse of Multilinguality by Pre-training Modular Transformers
Jonas Pfeiffer, Naman Goyal, Xi Lin, Xian Li, James Cross, Sebastian Riedel, Mikel Artetxe
arXiv:2205.06266, 12 May 2022
Papers citing "Lifting the Curse of Multilinguality by Pre-training Modular Transformers" (8 of 58 papers shown)

1. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova. 11 Oct 2018.
2. XNLI: Evaluating Cross-lingual Sentence Representations. Alexis Conneau, Guillaume Lample, Ruty Rinott, Adina Williams, Samuel R. Bowman, Holger Schwenk, Veselin Stoyanov. 13 Sep 2018.
3. Attention Is All You Need. Ashish Vaswani, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, Illia Polosukhin. 12 Jun 2017.
4. Learning multiple visual domains with residual adapters. Sylvestre-Alvise Rebuffi, Hakan Bilen, Andrea Vedaldi. 22 May 2017.
5. A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference. Adina Williams, Nikita Nangia, Samuel R. Bowman. 18 Apr 2017.
6. Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer. Noam M. Shazeer, Azalia Mirhoseini, Krzysztof Maziarz, Andy Davis, Quoc V. Le, Geoffrey E. Hinton, J. Dean. 23 Jan 2017.
7. SQuAD: 100,000+ Questions for Machine Comprehension of Text. Pranav Rajpurkar, Jian Zhang, Konstantin Lopyrev, Percy Liang. 16 Jun 2016.
8. Learning to Compose Neural Networks for Question Answering. Jacob Andreas, Marcus Rohrbach, Trevor Darrell, Dan Klein. 07 Jan 2016.