AdapterFusion: Non-Destructive Task Composition for Transfer Learning
arXiv:2005.00247 (v3, latest) · 1 May 2020
Jonas Pfeiffer, Aishwarya Kamath, Andreas Rücklé, Kyunghyun Cho, Iryna Gurevych
Communities: CLL, MoMe
Papers citing "AdapterFusion: Non-Destructive Task Composition for Transfer Learning" (showing 50 of 553)
Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning
Zixuan Ke, Bing-Quan Liu, Nianzu Ma, Hu Xu, Lei Shu · CLL · 05 Dec 2021
Many Heads but One Brain: Fusion Brain -- a Competition and a Single Multimodal Multitask Architecture
Daria Bakshandaeva, Denis Dimitrov, V.Ya. Arkhipkin, Alex Shonenkov, M. Potanin, ..., Mikhail Martynov, Anton Voronov, Vera Davydova, E. Tutubalina, Aleksandr Petiushko · 22 Nov 2021
Leveraging Sentiment Analysis Knowledge to Solve Emotion Detection Tasks
Maude Nguyen-The, Guillaume-Alexandre Bilodeau, Jan Rockemann · 05 Nov 2021
Unsupervised Domain Adaptation with Adapter
Rongsheng Zhang, Yinhe Zheng, Xiaoxi Mao, Minlie Huang · 01 Nov 2021
Multilingual Domain Adaptation for NMT: Decoupling Language and Domain Information with Adapters
Asa Cooper Stickland, Alexandre Berard, Vassilina Nikoulina · AI4CE · 18 Oct 2021
Lifelong Pretraining: Continually Adapting Language Models to Emerging Corpora
Xisen Jin, Dejiao Zhang, Henghui Zhu, Wei Xiao, Shang-Wen Li, Xiaokai Wei, Andrew O. Arnold, Xiang Ren · KELM, CLL · 16 Oct 2021
DS-TOD: Efficient Domain Specialization for Task Oriented Dialog
Chia-Chien Hung, Anne Lauscher, Simone Paolo Ponzetto, Goran Glavaš · 15 Oct 2021
UniPELT: A Unified Framework for Parameter-Efficient Language Model Tuning
Yuning Mao, Lambert Mathias, Rui Hou, Amjad Almahairi, Hao Ma, Jiawei Han, Wen-tau Yih, Madian Khabsa · 14 Oct 2021
Composable Sparse Fine-Tuning for Cross-Lingual Transfer
Alan Ansell, Edoardo Ponti, Anna Korhonen, Ivan Vulić · CLL, MoE · 14 Oct 2021
LFPT5: A Unified Framework for Lifelong Few-shot Language Learning Based on Prompt Tuning of T5
Chengwei Qin, Shafiq Joty · CLL · 14 Oct 2021
Differentially Private Fine-tuning of Language Models
Da Yu, Saurabh Naik, A. Backurs, Sivakanth Gopi, Huseyin A. Inan, ..., Y. Lee, Andre Manoel, Lukas Wutschitz, Sergey Yekhanin, Huishuai Zhang · 13 Oct 2021
Towards a Unified View of Parameter-Efficient Transfer Learning
Junxian He, Chunting Zhou, Xuezhe Ma, Taylor Berg-Kirkpatrick, Graham Neubig · AAML · 08 Oct 2021
Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics
Prajjwal Bhargava, Aleksandr Drozd, Anna Rogers · 04 Oct 2021
Single-dataset Experts for Multi-dataset Question Answering
Dan Friedman, Ben Dodge, Danqi Chen · MoMe · 28 Sep 2021
The Trade-offs of Domain Adaptation for Neural Language Models
David Grangier, Dan Iter · 21 Sep 2021
xGQA: Cross-Lingual Visual Question Answering
Jonas Pfeiffer, Gregor Geigle, Aishwarya Kamath, Jan-Martin O. Steitz, Stefan Roth, Ivan Vulić, Iryna Gurevych · 13 Sep 2021
Total Recall: a Customized Continual Learning Method for Neural Semantic Parsers
Zhuang Li, Gholamreza Haffari · CLL · 11 Sep 2021
Efficient Test Time Adapter Ensembling for Low-resource Language Varieties
Xinyi Wang, Yulia Tsvetkov, Sebastian Ruder, Graham Neubig · 10 Sep 2021
Mixture-of-Partitions: Infusing Large Biomedical Knowledge Graphs into BERT
Zaiqiao Meng, Fangyu Liu, T. H. Clark, Ehsan Shareghi, Nigel Collier · MoE · 10 Sep 2021
MetaXT: Meta Cross-Task Transfer between Disparate Label Spaces
Srinagesh Sharma, Guoqing Zheng, Ahmed Hassan Awadallah · 09 Sep 2021
Sustainable Modular Debiasing of Language Models
Anne Lauscher, Tobias Lüken, Goran Glavaš · 08 Sep 2021
MultiEURLEX -- A multi-lingual and multi-label legal document classification dataset for zero-shot cross-lingual transfer
Ilias Chalkidis, Manos Fergadiotis, Ion Androutsopoulos · AILaw · 02 Sep 2021
AdapterHub Playground: Simple and Flexible Few-Shot Learning with Adapters
Tilman Beck, Bela Bohlender, Christina Viehmann, Vincent Hane, Yanik Adamson, Jaber Khuri, Jonas Brossmann, Jonas Pfeiffer, Iryna Gurevych · 18 Aug 2021
Robust Transfer Learning with Pretrained Language Models through Adapters
Wenjuan Han, Bo Pang, Ying Nian Wu · 05 Aug 2021
An Adapter Based Pre-Training for Efficient and Scalable Self-Supervised Speech Representation Learning
Samuel Kessler, Bethan Thomas, S. Karout · SSL · 26 Jul 2021
LoRA: Low-Rank Adaptation of Large Language Models
J. E. Hu, Yelong Shen, Phillip Wallis, Zeyuan Allen-Zhu, Yuanzhi Li, Shean Wang, Lu Wang, Weizhu Chen · OffRL, AI4TS, AI4CE, ALM, AIMat · 17 Jun 2021
Specializing Multilingual Language Models: An Empirical Study
Ethan C. Chau, Noah A. Smith · 16 Jun 2021
Neural Supervised Domain Adaptation by Augmenting Pre-trained Models with Random Units
Sara Meftah, N. Semmar, Y. Tamaazousti, H. Essafi, F. Sadat · 09 Jun 2021
Compacter: Efficient Low-Rank Hypercomplex Adapter Layers
Rabeeh Karimi Mahabadi, James Henderson, Sebastian Ruder · MoE · 08 Jun 2021
On the Effectiveness of Adapter-based Tuning for Pretrained Language Model Adaptation
Ruidan He, Linlin Liu, Hai Ye, Qingyu Tan, Bosheng Ding, Liying Cheng, Jia-Wei Low, Lidong Bing, Luo Si · 06 Jun 2021
Lightweight Adapter Tuning for Multilingual Speech Translation
Hang Le, J. Pino, Changhan Wang, Jiatao Gu, D. Schwab, Laurent Besacier · 02 Jun 2021
Exploiting Adapters for Cross-lingual Low-resource Speech Recognition
Wenxin Hou, Hanlin Zhu, Yidong Wang, Jindong Wang, Tao Qin, Renjun Xu, T. Shinozaki · 18 May 2021
MineGAN++: Mining Generative Models for Efficient Knowledge Transfer to Limited Data Domains
Yaxing Wang, Abel Gonzalez-Garcia, Chenshen Wu, Luis Herranz, Fahad Shahbaz Khan, Shangling Jui, Joost van de Weijer · 28 Apr 2021
XLM-T: Multilingual Language Models in Twitter for Sentiment Analysis and Beyond
Francesco Barbieri, Luis Espinosa Anke, Jose Camacho-Collados · 25 Apr 2021
What to Pre-Train on? Efficient Intermediate Task Selection
Clifton A. Poth, Jonas Pfeiffer, Andreas Rücklé, Iryna Gurevych · 16 Apr 2021
MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning
Mengzhou Xia, Guoqing Zheng, Subhabrata Mukherjee, Milad Shokouhi, Graham Neubig, Ahmed Hassan Awadallah · 16 Apr 2021
Continual Learning for Text Classification with Information Disentanglement Based Regularization
Yufan Huang, Yanzhe Zhang, Jiaao Chen, Xuezhi Wang, Diyi Yang · CLL · 12 Apr 2021
Structural Adapters in Pretrained Language Models for AMR-to-text Generation
Leonardo F. R. Ribeiro, Yue Zhang, Iryna Gurevych · 16 Mar 2021
Adapting MARBERT for Improved Arabic Dialect Identification: Submission to the NADI 2021 Shared Task
Badr AlKhamissi, Mohamed Gabr, Muhammad N. ElNokrashy, Khaled Essam · 01 Mar 2021
Trankit: A Light-Weight Transformer-based Toolkit for Multilingual Natural Language Processing
Minh Nguyen, Viet Dac Lai, Amir Pouran Ben Veyseh, Thien Huu Nguyen · 09 Jan 2021
Prefix-Tuning: Optimizing Continuous Prompts for Generation
Xiang Lisa Li, Percy Liang · 01 Jan 2021
WARP: Word-level Adversarial ReProgramming
Karen Hambardzumyan, Hrant Khachatrian, Jonathan May · AAML · 01 Jan 2021
How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models
Phillip Rust, Jonas Pfeiffer, Ivan Vulić, Sebastian Ruder, Iryna Gurevych · 31 Dec 2020
UNKs Everywhere: Adapting Multilingual Language Models to New Scripts
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, Sebastian Ruder · 31 Dec 2020
Verb Knowledge Injection for Multilingual Event Processing
Olga Majewska, Ivan Vulić, Goran Glavaš, Edoardo Ponti, Anna Korhonen · 31 Dec 2020
Parameter-Efficient Transfer Learning with Diff Pruning
Demi Guo, Alexander M. Rush, Yoon Kim · 14 Dec 2020
Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer
M. Vidoni, Ivan Vulić, Goran Glavaš · 11 Dec 2020
AdapterDrop: On the Efficiency of Adapters in Transformers
Andreas Rücklé, Gregor Geigle, Max Glockner, Tilman Beck, Jonas Pfeiffer, Nils Reimers, Iryna Gurevych · 22 Oct 2020
MultiCQA: Zero-Shot Transfer of Self-Supervised Text Matching Models on a Massive Scale
Andreas Rücklé, Jonas Pfeiffer, Iryna Gurevych · 02 Oct 2020
Conditionally Adaptive Multi-Task Learning: Improving Transfer Learning in NLP Using Fewer Parameters & Less Data
Jonathan Pilault, Amine Elhattami, C. Pal · CLL, MoE · 19 Sep 2020