OMoE: Diversifying Mixture of Low-Rank Adaptation by Orthogonal Finetuning
arXiv:2501.10062 · 20 January 2025 · MoE
Jinyuan Feng, Zhiqiang Pu, Tianyi Hu, Dongmin Li, Xiaolin Ai, Huimu Wang
Papers citing "OMoE: Diversifying Mixture of Low-Rank Adaptation by Orthogonal Finetuning" (4 papers)
Two Is Better Than One: Rotations Scale LoRAs
Hongcan Guo, Guoshun Nan, Yuan Yang, Diyang Zhang, Haotian Li, ..., Yuhan Ran, Xinye Cao, Sicong Leng, Xiaofeng Tao, Xudong Jiang
29 May 2025

CoMoE: Contrastive Representation for Mixture-of-Experts in Parameter-Efficient Fine-tuning
Jinyuan Feng, Chaopeng Wei, Tenghai Qiu, Tianyi Hu, Zhiqiang Pu
MoE · 23 May 2025

FT-MoE: Sustainable-learning Mixture of Experts Model for Fault-Tolerant Computing with Multiple Tasks
Wenjing Xiao, Wenhao Song, Miaojiang Chen, Ruikun Luo, Min Chen
MoE · 29 Apr 2025

Mixture of Group Experts for Learning Invariant Representations
Lei Kang, Jia Li, Mi Tian, Hua Huang
MoE · 12 Apr 2025