On-Device Collaborative Language Modeling via a Mixture of Generalists and Specialists

20 September 2024
Dongyang Fan
Bettina Messmer
Nikita Doikov
Martin Jaggi
Communities: MoMe, MoE
Abstract

On-device LLMs have gained increasing attention for their ability to enhance privacy and provide a personalized user experience. To facilitate private learning with scarce data, Federated Learning has become a standard approach. However, it faces challenges such as computational resource heterogeneity and data heterogeneity among end users. We propose CoMiGS (Collaborative learning with a Mixture of Generalists and Specialists), the first approach to address both challenges. A key innovation of our method is the bi-level optimization formulation of the Mixture-of-Experts learning objective, where the router is optimized on a separate validation set to ensure alignment with the target distribution. We solve our objective with alternating minimization, for which we provide a theoretical analysis. Our method shares generalist experts across users while localizing a varying number of specialist experts, thereby adapting to users' computational resources and preserving privacy. Through extensive experiments, we show that CoMiGS effectively balances general and personalized knowledge for each token generation. We demonstrate that CoMiGS remains robust against overfitting, thanks to the generalists' regularizing effect, while still adapting to local data through specialist expertise. We open-source our codebase for collaborative LLMs.
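The abstract describes a per-token Mixture-of-Experts in which shared generalist experts and user-local specialist experts are combined by a router, the router is trained on a held-out validation split (the bi-level objective), and experts and router are updated by alternating minimization. Below is a minimal, hypothetical PyTorch sketch of that idea; the module name CoMiGSLayer, the linear experts, and the two-optimizer training step are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CoMiGSLayer(nn.Module):
    """Per-token mixture of shared generalists and local specialists (illustrative)."""
    def __init__(self, d_model: int, n_generalists: int = 1, n_specialists: int = 1):
        super().__init__()
        # Generalist experts: intended to be shared/aggregated across users.
        self.generalists = nn.ModuleList(
            nn.Linear(d_model, d_model) for _ in range(n_generalists))
        # Specialist experts: kept local; their count can vary per user
        # to match that user's compute budget.
        self.specialists = nn.ModuleList(
            nn.Linear(d_model, d_model) for _ in range(n_specialists))
        # Router produces per-token mixture weights over all experts.
        self.router = nn.Linear(d_model, n_generalists + n_specialists)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        weights = F.softmax(self.router(x), dim=-1)           # (B, S, E)
        experts = list(self.generalists) + list(self.specialists)
        outs = torch.stack([e(x) for e in experts], dim=-1)   # (B, S, D, E)
        return (outs * weights.unsqueeze(-2)).sum(dim=-1)     # (B, S, D)

def alternating_step(model, train_batch, val_batch,
                     opt_experts, opt_router, loss_fn):
    """One round of alternating minimization for the bi-level objective."""
    # Inner step: fit the expert parameters on the training split.
    xs, ys = train_batch
    opt_experts.zero_grad()
    loss_fn(model(xs), ys).backward()
    opt_experts.step()
    # Outer step: fit the router on a separate validation split, so the
    # routing aligns with the target distribution rather than with the
    # training data the experts just fit.
    xv, yv = val_batch
    opt_router.zero_grad()
    loss_fn(model(xv), yv).backward()
    opt_router.step()

In a full on-device system, only the generalist parameters would be synchronized across users (e.g. by federated averaging), while the specialists and the router stay local; this sketch shows only the single-user update.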

View on arXiv
@article{fan2025_2409.13931,
  title={On-Device Collaborative Language Modeling via a Mixture of Generalists and Specialists},
  author={Dongyang Fan and Bettina Messmer and Nikita Doikov and Martin Jaggi},
  journal={arXiv preprint arXiv:2409.13931},
  year={2025}
}