ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Continual Mixed-Language Pre-Training for Extremely Low-Resource Neural Machine Translation

9 May 2021
Zihan Liu, Genta Indra Winata, Pascale Fung
VLM, CLL

Papers citing "Continual Mixed-Language Pre-Training for Extremely Low-Resource Neural Machine Translation"

11 / 11 papers shown
From Priest to Doctor: Domain Adaptation for Low-Resource Neural Machine Translation
Ali Marashian, Enora Rice, Luke Gessler, Alexis Palmer, K. Wense
24 Feb 2025

A Comprehensive Evaluation of Parameter-Efficient Fine-Tuning on Software Engineering Tasks
Wentao Zou, Qi Li, Jidong Ge, Chuanyi Li, Xiaoyu Shen, LiGuo Huang, Bin Luo
25 Dec 2023

Bilex Rx: Lexical Data Augmentation for Massively Multilingual Machine Translation
Alex Jones, Isaac Caswell, Ishan Saxena, Orhan Firat
27 Mar 2023

Towards Geospatial Foundation Models via Continual Pretraining
Matías Mendieta, Boran Han, Xingjian Shi, Yi Zhu, Chen Chen
VLM, AI4CE
09 Feb 2023

Continual Learning of Neural Machine Translation within Low Forgetting Risk Regions
Shuhao Gu, Bojie Hu, Yang Feng
CLL
03 Nov 2022

Pre-Trained Multilingual Sequence-to-Sequence Models: A Hope for Low-Resource Language Translation?
E. Lee, Sarubi Thillainathan, Shravan Nayak, Surangika Ranathunga, David Ifeoluwa Adelani, Ruisi Su, Arya D. McCarthy
VLM
16 Mar 2022

NL-Augmenter: A Framework for Task-Sensitive Natural Language Augmentation
Kaustubh D. Dhole, Varun Gangal, Sebastian Gehrmann, Aadesh Gupta, Zhenhao Li, ..., Tianbao Xie, Usama Yaseen, Michael A. Yee, Jing Zhang, Yue Zhang
06 Dec 2021

AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages
Machel Reid, Junjie Hu, Graham Neubig, Y. Matsuo
10 Sep 2021

CrossNER: Evaluating Cross-Domain Named Entity Recognition
Zihan Liu, Yan Xu, Tiezheng Yu, Wenliang Dai, Ziwei Ji, Samuel Cahyawijaya, Andrea Madotto, Pascale Fung
08 Dec 2020

Word Translation Without Parallel Data
Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou
11 Oct 2017

Effective Approaches to Attention-based Neural Machine Translation
Thang Luong, Hieu H. Pham, Christopher D. Manning
17 Aug 2015