MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer

30 April 2020 · arXiv:2005.00052
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, Sebastian Ruder

Papers citing "MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer"

40 / 140 papers shown

Efficient Hierarchical Domain Adaptation for Pretrained Language Models
Alexandra Chronopoulou, Matthew E. Peters, Jesse Dodge · 16 Dec 2021

Recent Advances in Natural Language Processing via Large Pre-Trained Language Models: A Survey
Bonan Min, Hayley L Ross, Elior Sulem, Amir Pouran Ben Veyseh, Thien Huu Nguyen, Oscar Sainz, Eneko Agirre, Ilana Heintz, Dan Roth · LM&MA, VLM, AI4CE · 01 Nov 2021

Can Character-based Language Models Improve Downstream Task Performance in Low-Resource and Noisy Language Scenarios?
Arij Riabi, Benoît Sagot, Djamé Seddah · 26 Oct 2021

DS-TOD: Efficient Domain Specialization for Task Oriented Dialog
Chia-Chien Hung, Anne Lauscher, Simone Paolo Ponzetto, Goran Glavaš · 15 Oct 2021

Few-shot Controllable Style Transfer for Low-Resource Multilingual Settings
Kalpesh Krishna, Deepak Nathani, Xavier Garcia, Bidisha Samanta, Partha P. Talukdar · 14 Oct 2021

Language Models are Few-shot Multilingual Learners
Genta Indra Winata, Andrea Madotto, Zhaojiang Lin, Rosanne Liu, J. Yosinski, Pascale Fung · ELM, LRM · 16 Sep 2021

xGQA: Cross-Lingual Visual Question Answering
Jonas Pfeiffer, Gregor Geigle, Aishwarya Kamath, Jan-Martin O. Steitz, Stefan Roth, Ivan Vulić, Iryna Gurevych · 13 Sep 2021

Raise a Child in Large Language Model: Towards Effective and Generalizable Fine-tuning
Runxin Xu, Fuli Luo, Zhiyuan Zhang, Chuanqi Tan, Baobao Chang, Songfang Huang, Fei Huang · LRM · 13 Sep 2021

Efficient Test Time Adapter Ensembling for Low-resource Language Varieties
Xinyi Wang, Yulia Tsvetkov, Sebastian Ruder, Graham Neubig · 10 Sep 2021

MetaXT: Meta Cross-Task Transfer between Disparate Label Spaces
Srinagesh Sharma, Guoqing Zheng, Ahmed Hassan Awadallah · 09 Sep 2021

Sustainable Modular Debiasing of Language Models
Anne Lauscher, Tobias Lüken, Goran Glavaš · 08 Sep 2021

Nearest Neighbour Few-Shot Learning for Cross-lingual Classification
M Saiful Bari, Batool Haider, Saab Mansour · VLM · 06 Sep 2021

MultiEURLEX -- A multi-lingual and multi-label legal document classification dataset for zero-shot cross-lingual transfer
Ilias Chalkidis, Manos Fergadiotis, Ion Androutsopoulos · AILaw · 02 Sep 2021

Design and Scaffolded Training of an Efficient DNN Operator for Computer Vision on the Edge
Vinod Ganesan, Pratyush Kumar · 25 Aug 2021

An Adapter Based Pre-Training for Efficient and Scalable Self-Supervised Speech Representation Learning
Samuel Kessler, Bethan Thomas, S. Karout · SSL · 26 Jul 2021

Target-Oriented Fine-tuning for Zero-Resource Named Entity Recognition
Ying Zhang, Fandong Meng, Jinan Xu, Jie Zhou · 22 Jul 2021

A Primer on Pretrained Multilingual Language Models
Sumanth Doddapaneni, Gowtham Ramesh, Mitesh M. Khapra, Anoop Kunchukuttan, Pratyush Kumar · LRM · 01 Jul 2021

Specializing Multilingual Language Models: An Empirical Study
Ethan C. Chau, Noah A. Smith · 16 Jun 2021

Reinforced Iterative Knowledge Distillation for Cross-Lingual Named Entity Recognition
Shining Liang, Ming Gong, J. Pei, Linjun Shou, Wanli Zuo, Xianglin Zuo, Daxin Jiang · 01 Jun 2021

Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems
E. Razumovskaia, Goran Glavaš, Olga Majewska, E. Ponti, Anna Korhonen, Ivan Vulić · 17 Apr 2021

XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation
Sebastian Ruder, Noah Constant, Jan A. Botha, Aditya Siddhant, Orhan Firat, ..., Pengfei Liu, Junjie Hu, Dan Garrette, Graham Neubig, Melvin Johnson · ELM, AAML, LRM · 15 Apr 2021

Improving the Lexical Ability of Pretrained Language Models for Unsupervised Neural Machine Translation
Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser · SSL · 18 Mar 2021

Structural Adapters in Pretrained Language Models for AMR-to-text Generation
Leonardo F. R. Ribeiro, Yue Zhang, Iryna Gurevych · 16 Mar 2021

Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models
Po-Yao (Bernie) Huang, Mandela Patrick, Junjie Hu, Graham Neubig, Florian Metze, Alexander G. Hauptmann · MLLM, VLM · 16 Mar 2021

CANINE: Pre-training an Efficient Tokenization-Free Encoder for Language Representation
J. Clark, Dan Garrette, Iulia Turc, John Wieting · 11 Mar 2021

PADA: Example-based Prompt Learning for on-the-fly Adaptation to Unseen Domains
Eyal Ben-David, Nadav Oved, Roi Reichart · VLM, OOD · 24 Feb 2021

Trankit: A Light-Weight Transformer-based Toolkit for Multilingual Natural Language Processing
Minh Nguyen, Viet Dac Lai, Amir Pouran Ben Veyseh, Thien Huu Nguyen · 09 Jan 2021

Learning to Generate Task-Specific Adapters from Task Description
Qinyuan Ye, Xiang Ren · 02 Jan 2021

UNKs Everywhere: Adapting Multilingual Language Models to New Scripts
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, Sebastian Ruder · 31 Dec 2020

Parameter-Efficient Transfer Learning with Diff Pruning
Demi Guo, Alexander M. Rush, Yoon Kim · 14 Dec 2020

Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer
M. Vidoni, Ivan Vulić, Goran Glavaš · 11 Dec 2020

Emergent Communication Pretraining for Few-Shot Machine Translation
Yaoyiran Li, E. Ponti, Ivan Vulić, Anna Korhonen · 02 Nov 2020

Rethinking embedding coupling in pre-trained language models
Hyung Won Chung, Thibault Févry, Henry Tsai, Melvin Johnson, Sebastian Ruder · 24 Oct 2020

AdapterDrop: On the Efficiency of Adapters in Transformers
Andreas Rücklé, Gregor Geigle, Max Glockner, Tilman Beck, Jonas Pfeiffer, Nils Reimers, Iryna Gurevych · 22 Oct 2020

Model Selection for Cross-Lingual Transfer
Yang Chen, Alan Ritter · 13 Oct 2020

Probing Pretrained Language Models for Lexical Semantics
Ivan Vulić, E. Ponti, Robert Litschko, Goran Glavaš, Anna Korhonen · KELM · 12 Oct 2020

Efficient Transformers: A Survey
Yi Tay, Mostafa Dehghani, Dara Bahri, Donald Metzler · VLM · 14 Sep 2020

AdapterFusion: Non-Destructive Task Composition for Transfer Learning
Jonas Pfeiffer, Aishwarya Kamath, Andreas Rücklé, Kyunghyun Cho, Iryna Gurevych · CLL, MoMe · 01 May 2020

UDapter: Language Adaptation for Truly Universal Dependency Parsing
Ahmet Üstün, Arianna Bisazza, G. Bouma, Gertjan van Noord · 29 Apr 2020

What the [MASK]? Making Sense of Language-Specific BERT Models
Debora Nozza, Federico Bianchi, Dirk Hovy · 05 Mar 2020