A Novel Algorithm for Personalized Federated Learning: Knowledge Distillation with Weighted Combination Loss

6 April 2025
Hengrui Hu
Anai N. Kothari
Anjishnu Banerjee
Abstract

Federated learning (FL) offers a privacy-preserving framework for distributed machine learning, enabling collaborative model training across diverse clients without centralizing sensitive data. However, statistical heterogeneity, characterized by non-independent and identically distributed (non-IID) client data, poses significant challenges, leading to model drift and poor generalization. This paper proposes a novel algorithm, pFedKD-WCL (Personalized Federated Knowledge Distillation with Weighted Combination Loss), which integrates knowledge distillation with bi-level optimization to address non-IID challenges. pFedKD-WCL leverages the current global model as a teacher to guide local models, optimizing both global convergence and local personalization efficiently. We evaluate pFedKD-WCL on the MNIST dataset and a synthetic dataset with non-IID partitioning, using multinomial logistic regression and multilayer perceptron models. Experimental results demonstrate that pFedKD-WCL outperforms state-of-the-art algorithms, including FedAvg, FedProx, Per-FedAvg, and pFedMe, in terms of accuracy and convergence speed.
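
The abstract describes pFedKD-WCL only at a high level. As a concrete illustration, the following is a minimal PyTorch sketch of one local client update with a weighted combination loss, where the current global model serves as a frozen teacher for the client's personalized model. The toy networks, the mixing weight alpha, and the distillation temperature are illustrative assumptions, not values taken from the paper.

# Minimal sketch of a weighted-combination distillation loss for
# personalized FL. alpha, the temperature, and the toy linear models
# are illustrative assumptions, not the paper's hyperparameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

def weighted_combination_loss(student_logits, teacher_logits, targets,
                              alpha=0.5, temperature=2.0):
    """Weighted sum of the local task loss and a distillation term:

    loss = alpha * CE(student, targets)
         + (1 - alpha) * T^2 * KL(teacher_soft || student_soft)
    """
    ce = F.cross_entropy(student_logits, targets)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * ce + (1.0 - alpha) * kd

# Toy local step: the frozen global model acts as the teacher.
torch.manual_seed(0)
teacher = nn.Linear(784, 10)   # stands in for the current global model
student = nn.Linear(784, 10)   # stands in for the personalized local model
optimizer = torch.optim.SGD(student.parameters(), lr=0.01)

x = torch.randn(32, 784)               # fake MNIST-like batch
y = torch.randint(0, 10, (32,))
with torch.no_grad():                   # teacher is not updated locally
    t_logits = teacher(x)

optimizer.zero_grad()
loss = weighted_combination_loss(student(x), t_logits, y)
loss.backward()
optimizer.step()
print(f"combined loss: {loss.item():.4f}")

In a full federated round, each client would presumably run several such local steps against a frozen copy of the latest global model before the server aggregates updates, consistent with the bi-level optimization the abstract describes.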

BibTeX
@article{hu2025_2504.04642,
  title={A Novel Algorithm for Personalized Federated Learning: Knowledge Distillation with Weighted Combination Loss},
  author={Hengrui Hu and Anai N. Kothari and Anjishnu Banerjee},
  journal={arXiv preprint arXiv:2504.04642},
  year={2025}
}