mmT5: Modular Multilingual Pre-Training Solves Source Language Hallucinations
arXiv:2305.14224 · 23 May 2023
Jonas Pfeiffer, Francesco Piccinno, Massimo Nicosia, Xinyi Wang, Machel Reid, Sebastian Ruder
Tags: VLM, LRM

Papers citing "mmT5: Modular Multilingual Pre-Training Solves Source Language Hallucinations" (4 of 54 shown)

Efficient parametrization of multi-domain deep neural networks
Sylvestre-Alvise Rebuffi, Hakan Bilen, Andrea Vedaldi
27 Mar 2018 · OOD

An Overview of Multi-Task Learning in Deep Neural Networks
Sebastian Ruder
15 Jun 2017 · CVBM

Learning multiple visual domains with residual adapters
Sylvestre-Alvise Rebuffi, Hakan Bilen, Andrea Vedaldi
22 May 2017 · OOD

Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer
Noam M. Shazeer, Azalia Mirhoseini, Krzysztof Maziarz, Andy Davis, Quoc V. Le, Geoffrey E. Hinton, J. Dean
23 Jan 2017 · MoE