Mixture-of-Domain-Adapters: Decoupling and Injecting Domain Knowledge to Pre-trained Language Models Memories
8 June 2023
Shizhe Diao, Tianyang Xu, Ruijia Xu, Jiawei Wang, Tong Zhang
MoE · AI4CE
ArXiv 2306.05406 (abs) · PDF · HTML · GitHub (48★)
Papers citing "Mixture-of-Domain-Adapters: Decoupling and Injecting Domain Knowledge to Pre-trained Language Models Memories" (7 papers shown)
Pastiche Novel Generation Creating: Fan Fiction You Love in Your Favorite Author's Style
Xueran Han, Yuhan Liu, Mingzhe Li, Wen Liu, Sen Hu, Rui Yan, Zhiqiang Xu, Preslav Nakov
24 Feb 2025
Ensembles of Low-Rank Expert Adapters
Yinghao Li, Vianne Gao, Chao Zhang, MohamadAli Torkamani
31 Jan 2025
Glider: Global and Local Instruction-Driven Expert Router
Pingzhi Li, Prateek Yadav, Jaehong Yoon, Jie Peng, Yi-Lin Sung, Joey Tianyi Zhou, Tianlong Chen
MoMe · MoE
09 Oct 2024
Multi-Task Domain Adaptation for Language Grounding with 3D Objects
Penglei Sun, Yaoxian Song, Xinglin Pan, Peijie Dong, Xiaofei Yang, Qiang-qiang Wang, Zhixu Li, Tiefeng Li, Xiaowen Chu
03 Jul 2024
Personalized LLM Response Generation with Parameterized Memory Injection
Kai Zhang, Lizhi Qing, Yangyang Kang
04 Apr 2024
Human Centered AI for Indian Legal Text Analytics
Sudipto Ghosh, Devanshu Verma, Balaji Ganesan, Purnima Bindal, Vikas Kumar, Vasudha Bhatnagar
16 Mar 2024
Foundation Model Sherpas: Guiding Foundation Models through Knowledge and Reasoning
D. Bhattacharjya, Junkyu Lee, Don Joven Agravante, Balaji Ganesan, Radu Marinescu
LLMAG
02 Feb 2024