Merging Models on the Fly Without Retraining: A Sequential Approach to Scalable Continual Model Merging
17 January 2025 · arXiv:2501.09522
Anke Tang, Enneng Yang, Li Shen, Yong Luo, Han Hu, Bo Du, Dacheng Tao
Tags: MoMe, CLL

Papers citing "Merging Models on the Fly Without Retraining: A Sequential Approach to Scalable Continual Model Merging"

3 of 3 papers shown

MINGLE: Mixtures of Null-Space Gated Low-Rank Experts for Test-Time Continual Model Merging
Zihuan Qiu, Yi Xu, Chiyuan He, Fanman Meng, Linfeng Xu, Qi Wu, Hongliang Li
Tags: CLL, MoMe · 17 May 2025

Parameter-Efficient Continual Fine-Tuning: A Survey
Eric Nuertey Coleman, Luigi Quarantiello, Ziyue Liu, Qinwen Yang, Samrat Mukherjee, J. Hurtado, Vincenzo Lomonaco
Tags: CLL · 18 April 2025

Scalable Model Merging with Progressive Layer-wise Distillation
Jing Xu, Jiazheng Li, J.N. Zhang
Tags: MoMe, FedML · 18 February 2025