OMoE: Diversifying Mixture of Low-Rank Adaptation by Orthogonal Finetuning (arXiv:2501.10062)

20 January 2025
Jinyuan Feng, Zhiqiang Pu, Tianyi Hu, Dongmin Li, Xiaolin Ai, Huimu Wang
MoE
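The MoE tag above marks this as a mixture-of-experts paper: OMoE combines several low-rank adapters (LoRA) and keeps their contributions diverse through an orthogonality constraint. The sketch below is a minimal, hypothetical illustration of that general idea, not the paper's implementation; the module names, the token-wise routing scheme, and the exact form of the penalty are all assumptions.

# Minimal sketch of a mixture-of-LoRA layer with an orthogonality penalty.
# All names and hyperparameters are illustrative assumptions, not OMoE's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixtureOfLoRA(nn.Module):
    """Frozen linear layer augmented with several LoRA experts and a learned router."""

    def __init__(self, d_in, d_out, num_experts=4, rank=8, alpha=16.0):
        super().__init__()
        self.base = nn.Linear(d_in, d_out, bias=False)
        self.base.weight.requires_grad_(False)        # pretrained weight stays frozen
        self.router = nn.Linear(d_in, num_experts)    # token-wise gating (assumed)
        # Low-rank factors A (d_in -> rank) and B (rank -> d_out), one pair per expert.
        self.A = nn.Parameter(torch.randn(num_experts, d_in, rank) * 0.01)
        self.B = nn.Parameter(torch.zeros(num_experts, rank, d_out))
        self.scale = alpha / rank

    def forward(self, x):                             # x: (batch, d_in)
        gates = F.softmax(self.router(x), dim=-1)     # (batch, num_experts)
        # Expert outputs: x @ A_e @ B_e for every expert e.
        expert_out = torch.einsum("bd,edr,ero->beo", x, self.A, self.B)
        delta = (gates.unsqueeze(-1) * expert_out).sum(dim=1)
        return self.base(x) + self.scale * delta

    def orthogonality_penalty(self):
        """Penalize overlap between experts' down-projections so they span
        distinct subspaces (one plausible reading of 'orthogonal finetuning')."""
        flat = F.normalize(self.A.flatten(1), dim=-1)  # (num_experts, d_in * rank)
        gram = flat @ flat.t()
        off_diag = gram - torch.eye(gram.size(0), device=gram.device)
        return off_diag.pow(2).sum()

In training, such a penalty would typically be added to the task objective with a small weight, e.g. loss = task_loss + lambda_orth * layer.orthogonality_penalty(), where lambda_orth is a tuning hyperparameter.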

Papers citing "OMoE: Diversifying Mixture of Low-Rank Adaptation by Orthogonal Finetuning"

4 of 4 papers shown

Two Is Better Than One: Rotations Scale LoRAs
Hongcan Guo, Guoshun Nan, Yuan Yang, Diyang Zhang, Haotian Li, ..., Yuhan Ran, Xinye Cao, Sicong Leng, Xiaofeng Tao, Xudong Jiang
29 May 2025

CoMoE: Contrastive Representation for Mixture-of-Experts in Parameter-Efficient Fine-tuning
Jinyuan Feng, Chaopeng Wei, Tenghai Qiu, Tianyi Hu, Zhiqiang Pu
MoE
23 May 2025

FT-MoE: Sustainable-learning Mixture of Experts Model for Fault-Tolerant Computing with Multiple Tasks
Wenjing Xiao, Wenhao Song, Miaojiang Chen, Ruikun Luo, Min Chen
MoE
29 Apr 2025

Mixture of Group Experts for Learning Invariant Representations
Lei Kang, Jia Li, Mi Tian, Hua Huang
MoE
12 Apr 2025