MoR: Mixture of Ranks for Low-Rank Adaptation Tuning

17 October 2024
Chuanyu Tang, Yilong Chen, Zhenyu Zhang, Junyuan Shang, Wenyuan Zhang, Yong Huang, Tingwen Liu
Topic: MoE

Papers citing "MoR: Mixture of Ranks for Low-Rank Adaptation Tuning"

4 papers:
1. Mixture-of-Subspaces in Low-Rank Adaptation
   Taiqiang Wu, Jiahao Wang, Zhe Zhao, Ngai Wong (16 Jun 2024)
2. LoRA Dropout as a Sparsity Regularizer for Overfitting Control
   Yang Lin, Xinyu Ma, Xu Chu, Yujie Jin, Zhibang Yang, Yasha Wang, Hong-yan Mei (15 Apr 2024)
3. LoRA: Low-Rank Adaptation of Large Language Models
   J. E. Hu, Yelong Shen, Phillip Wallis, Zeyuan Allen-Zhu, Yuanzhi Li, Shean Wang, Lu Wang, Weizhu Chen (17 Jun 2021)
   Topics: OffRL, AI4TS, AI4CE, ALM, AIMat
4. Intrinsic Dimensionality Explains the Effectiveness of Language Model Fine-Tuning
   Armen Aghajanyan, Luke Zettlemoyer, Sonal Gupta (22 Dec 2020)