
FedJETs: Efficient Just-In-Time Personalization with Federated Mixture of Experts

arXiv:2306.08586 · 14 June 2023
Chen Dun, Mirian Hipolito Garcia, Guoqing Zheng, Ahmed Hassan Awadallah, Robert Sim, Anastasios Kyrillidis, Dimitrios Dimitriadis
Tags: FedML, MoE

Papers citing "FedJETs: Efficient Just-In-Time Personalization with Federated Mixture of Experts"

7 / 7 papers shown
1. Token-Level Prompt Mixture with Parameter-Free Routing for Federated Domain Generalization
   Shuai Gong, C. Cui, Xiaolin Dong, Xiushan Nie, Lei Zhu, Xiaojun Chang
   Tags: FedML, MoE
   29 Apr 2025

2. A Comprehensive Survey of Mixture-of-Experts: Algorithms, Theory, and Applications
   Siyuan Mu, Sen Lin
   Tags: MoE
   10 Mar 2025

3. Personalized Federated Fine-Tuning for LLMs via Data-Driven Heterogeneous Model Architectures
   Yicheng Zhang, Zhen Qin, Zhaomin Wu, Jian Hou, Shuiguang Deng
   28 Nov 2024

4. FedMoE-DA: Federated Mixture of Experts via Domain Aware Fine-grained Aggregation
   Ziwei Zhan, Wenkuan Zhao, Yuanqing Li, Weijie Liu, Xiaoxi Zhang, Chee Wei Tan, Chuan Wu, Deke Guo, Xu Chen
   Tags: MoE
   04 Nov 2024

5. Mixture of Experts Made Personalized: Federated Prompt Learning for Vision-Language Models
   Jun Luo, Chong Chen, Shandong Wu
   Tags: FedML, VLM, MoE
   14 Oct 2024

6. pFedMoE: Data-Level Personalization with Mixture of Experts for Model-Heterogeneous Personalized Federated Learning
   Liping Yi, Han Yu, Chao Ren, Heng-Ming Zhang, Gang Wang, Xiaoguang Liu, Xiaoxiao Li
   Tags: MoE
   02 Feb 2024

7. Federated Learning on Non-IID Data Silos: An Experimental Study
   Yue Liu, Yiqun Diao, Quan Chen, Bingsheng He
   Tags: FedML, OOD
   03 Feb 2021