
LoRAMoE: Alleviate World Knowledge Forgetting in Large Language Models via MoE-Style Plugin
Shihan Dou, Enyu Zhou, Yan Liu, Songyang Gao, Jun Zhao, Wei Shen, Yuhao Zhou, Zhiheng Xi, Xiao Wang, Xiaoran Fan, Shiliang Pu, Jiang Zhu, Rui Zheng, Tao Gui, Qi Zhang, Xuanjing Huang