Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT

16 September 2020
Alexandra Chronopoulou
Dario Stojanovski
Alexander Fraser
arXiv:2009.07610

Papers citing "Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT"

17 papers shown
mmT5: Modular Multilingual Pre-Training Solves Source Language Hallucinations
Jonas Pfeiffer, Francesco Piccinno, Massimo Nicosia, Xinyi Wang, Machel Reid, Sebastian Ruder
VLM, LRM
23 May 2023

Mini-Model Adaptation: Efficiently Extending Pretrained Models to New Languages via Aligned Shallow Training
Kelly Marchisio, Patrick Lewis, Yihong Chen, Mikel Artetxe
20 Dec 2022

High-Resource Methodological Bias in Low-Resource Investigations
Maartje ter Hoeve, David Grangier, Natalie Schluter
14 Nov 2022

Language Generation Models Can Cause Harm: So What Can We Do About It? An Actionable Survey
Sachin Kumar, Vidhisha Balachandran, Lucille Njoo, Antonios Anastasopoulos, Yulia Tsvetkov
ELM
14 Oct 2022

Language-Family Adapters for Low-Resource Multilingual Neural Machine Translation
Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser
30 Sep 2022

Training a T5 Using Lab-sized Resources
Manuel R. Ciosici, Leon Derczynski
VLM
25 Aug 2022

Lifting the Curse of Multilinguality by Pre-training Modular Transformers
Jonas Pfeiffer, Naman Goyal, Xi Lin, Xian Li, James Cross, Sebastian Riedel, Mikel Artetxe
LRM
12 May 2022

Oolong: Investigating What Makes Transfer Learning Hard with Controlled Studies
Zhengxuan Wu, Alex Tamkin, Isabel Papadimitriou
24 Feb 2022

Exploiting Curriculum Learning in Unsupervised Neural Machine Translation
Jinliang Lu, Jiajun Zhang
23 Sep 2021

Subword Mapping and Anchoring across Languages
Giorgos Vernikos, Andrei Popescu-Belis
09 Sep 2021

Neural Machine Translation for Low-Resource Languages: A Survey
Surangika Ranathunga, E. Lee, Marjana Prifti Skenduli, Ravi Shekhar, Mehreen Alam, Rishemjit Kaur
29 Jun 2021

Machine Translation into Low-resource Language Varieties
Sachin Kumar, Antonios Anastasopoulos, S. Wintner, Yulia Tsvetkov
12 Jun 2021

Crosslingual Embeddings are Essential in UNMT for Distant Languages: An English to IndoAryan Case Study
Tamali Banerjee, V. Rudra Murthy, P. Bhattacharyya
09 Jun 2021

Family of Origin and Family of Choice: Massively Parallel Lexiconized Iterative Pretraining for Severely Low Resource Machine Translation
Zhong Zhou, Alexander Waibel
12 Apr 2021

Improving the Lexical Ability of Pretrained Language Models for Unsupervised Neural Machine Translation
Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser
SSL
18 Mar 2021

Towards Continual Learning for Multilingual Machine Translation via Vocabulary Substitution
Xavier Garcia, Noah Constant, Ankur P. Parikh, Orhan Firat
11 Mar 2021

The LMU Munich System for the WMT 2020 Unsupervised Machine Translation Shared Task
Alexandra Chronopoulou, Dario Stojanovski, Viktor Hangya, Alexander Fraser
25 Oct 2020