mmT5: Modular Multilingual Pre-Training Solves Source Language Hallucinations

23 May 2023
Jonas Pfeiffer, Francesco Piccinno, Massimo Nicosia, Xinyi Wang, Machel Reid, Sebastian Ruder
Tags: VLM, LRM

Papers citing "mmT5: Modular Multilingual Pre-Training Solves Source Language Hallucinations"

Showing 4 of 54 citing papers.
Efficient parametrization of multi-domain deep neural networks (27 Mar 2018)
Sylvestre-Alvise Rebuffi, Hakan Bilen, Andrea Vedaldi
Tags: OOD
An Overview of Multi-Task Learning in Deep Neural Networks (15 Jun 2017)
Sebastian Ruder
Tags: CVBM
Learning multiple visual domains with residual adapters (22 May 2017)
Sylvestre-Alvise Rebuffi, Hakan Bilen, Andrea Vedaldi
Tags: OOD
Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (23 Jan 2017)
Noam M. Shazeer, Azalia Mirhoseini, Krzysztof Maziarz, Andy Davis, Quoc V. Le, Geoffrey E. Hinton, J. Dean
Tags: MoE