Recipes for Adapting Pre-trained Monolingual and Multilingual Models to Machine Translation

30 April 2020
Asa Cooper Stickland, Xian Li, Marjan Ghazvininejad
arXiv:2004.14911

Papers citing "Recipes for Adapting Pre-trained Monolingual and Multilingual Models to Machine Translation"

14 papers
From Priest to Doctor: Domain Adaptation for Low-Resource Neural Machine Translation
Ali Marashian, Enora Rice, Luke Gessler, Alexis Palmer, K. Wense
24 Feb 2025

Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization
Alexandra Chronopoulou, Jonas Pfeiffer, Joshua Maynez, Xinyi Wang, Sebastian Ruder, Priyanka Agrawal
15 Nov 2023

Extrapolating Multilingual Understanding Models as Multilingual Generators
Bohong Wu, Fei Yuan, Hai Zhao, Lei Li, Jingjing Xu
22 May 2023

m^4Adapter: Multilingual Multi-Domain Adaptation for Machine Translation with a Meta-Adapter
Wen Lai, Alexandra Chronopoulou, Alexander Fraser
21 Oct 2022

Pretrained Models for Multilingual Federated Learning
Orion Weller, Marc Marone, Vladimir Braverman, Dawn J Lawrie, Benjamin Van Durme
06 Jun 2022

When does Parameter-Efficient Transfer Learning Work for Machine Translation?
Ahmet Üstün, Asa Cooper Stickland
23 May 2022

Efficient Hierarchical Domain Adaptation for Pretrained Language Models
Alexandra Chronopoulou, Matthew E. Peters, Jesse Dodge
16 Dec 2021

Recent Advances in Natural Language Processing via Large Pre-Trained Language Models: A Survey
Bonan Min, Hayley L Ross, Elior Sulem, Amir Pouran Ben Veyseh, Thien Huu Nguyen, Oscar Sainz, Eneko Agirre, Ilana Heinz, Dan Roth
01 Nov 2021

Neural Machine Translation for Low-Resource Languages: A Survey
Surangika Ranathunga, E. Lee, Marjana Prifti Skenduli, Ravi Shekhar, Mehreen Alam, Rishemjit Kaur
29 Jun 2021

Prefix-Tuning: Optimizing Continuous Prompts for Generation
Xiang Lisa Li, Percy Liang
01 Jan 2021

Multilingual Speech Translation with Efficient Finetuning of Pretrained Models
Xian Li, Changhan Wang, Yun Tang, C. Tran, Yuqing Tang, J. Pino, Alexei Baevski, Alexis Conneau, Michael Auli
24 Oct 2020

Multilingual Neural Machine Translation with Language Clustering
Xu Tan, Jiale Chen, Di He, Yingce Xia, Tao Qin, Tie-Yan Liu
25 Aug 2019

Six Challenges for Neural Machine Translation
Philipp Koehn, Rebecca Knowles
12 Jun 2017

Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism
Orhan Firat, Kyunghyun Cho, Yoshua Bengio
06 Jan 2016