CoMoE: Contrastive Representation for Mixture-of-Experts in Parameter-Efficient Fine-tuning
Jinyuan Feng, Chaopeng Wei, Tenghai Qiu, Tianyi Hu, Zhiqiang Pu
arXiv: 2505.17553 · 23 May 2025
Tags: MoE
Papers citing "CoMoE: Contrastive Representation for Mixture-of-Experts in Parameter-Efficient Fine-tuning" (8 / 8 papers shown)
OMoE: Diversifying Mixture of Low-Rank Adaptation by Orthogonal Finetuning
Jinyuan Feng, Zhiqiang Pu, Tianyi Hu, Dongmin Li, Xiaolin Ai, Huimu Wang
Tags: MoE
20 Jan 2025
Scaling Large Language Model-based Multi-Agent Collaboration
Chen Qian, Zihao Xie, YiFei Wang, Wei Liu, Yufan Dang, ..., Zhuoyun Du, Weize Chen, Cheng Yang, Zhiyuan Liu, Maosong Sun
Tags: AI4CE, LLMAG, LM&Ro
11 Jun 2024
LoRAMoE: Alleviate World Knowledge Forgetting in Large Language Models via MoE-Style Plugin
Shihan Dou, Enyu Zhou, Yan Liu, Songyang Gao, Jun Zhao, ..., Jiang Zhu, Rui Zheng, Tao Gui, Qi Zhang, Xuanjing Huang
Tags: CLL, MoE, KELM
15 Dec 2023
Towards a Unified View of Parameter-Efficient Transfer Learning
Junxian He, Chunting Zhou, Xuezhe Ma, Taylor Berg-Kirkpatrick, Graham Neubig
Tags: AAML
08 Oct 2021
BitFit: Simple Parameter-efficient Fine-tuning for Transformer-based Masked Language-models
Elad Ben-Zaken, Shauli Ravfogel, Yoav Goldberg
18 Jun 2021
LoRA: Low-Rank Adaptation of Large Language Models
J. E. Hu, Yelong Shen, Phillip Wallis, Zeyuan Allen-Zhu, Yuanzhi Li, Shean Wang, Lu Wang, Weizhu Chen
Tags: OffRL, AI4TS, AI4CE, ALM, AIMat
17 Jun 2021
Prefix-Tuning: Optimizing Continuous Prompts for Generation
Xiang Lisa Li, Percy Liang
01 Jan 2021
BoolQ: Exploring the Surprising Difficulty of Natural Yes/No Questions
Christopher Clark, Kenton Lee, Ming-Wei Chang, Tom Kwiatkowski, Michael Collins, Kristina Toutanova
24 May 2019