Asynchronous Personalized Federated Learning through Global Memorization

Abstract

The proliferation of Internet of Things devices and advances in communication technology have unleashed an explosion of personal data, amplifying privacy concerns amid stringent regulations such as GDPR and CCPA. Federated Learning (FL) offers a privacy-preserving solution by enabling collaborative model training across decentralized devices without centralizing sensitive data. However, statistical heterogeneity arising from non-independent and identically distributed (non-IID) datasets, together with system heterogeneity caused by client dropouts (particularly of clients holding monopolistic classes), severely degrades the global model's performance. To address these challenges, we propose the Asynchronous Personalized Federated Learning (APFL) framework, which empowers clients to develop personalized models with the aid of a server-side semantic generator. This generator, trained via data-free knowledge transfer under global model supervision, enhances client data diversity by producing both seen and unseen samples, the latter enabled by Zero-Shot Learning to mitigate dropout-induced data loss. To counter the risk that synthetic data impairs training, we introduce a decoupled model interpolation method that ensures robust personalization. Extensive experiments demonstrate that APFL significantly outperforms state-of-the-art FL methods in tackling non-IID distributions and client dropouts, achieving superior accuracy and resilience across diverse real-world scenarios.
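The model interpolation idea mentioned above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's exact method: the function names, the parameter-dictionary representation, and the fixed mixing coefficient `alpha` are all assumptions for illustration; the actual decoupled scheme may mix components of the network differently or learn the coefficient per client.

```python
# Hypothetical sketch of per-client model interpolation: a personalized
# model is formed by blending local and global parameters.
# alpha close to 1 favors the client's local model; alpha close to 0
# favors the shared global model.

def interpolate_params(local, global_, alpha):
    """Return alpha * local + (1 - alpha) * global_ for each parameter."""
    return {k: alpha * local[k] + (1 - alpha) * global_[k] for k in local}

# Toy parameters standing in for real network weights.
local_params = {"w": 1.0, "b": 0.5}
global_params = {"w": 0.2, "b": 0.1}
mixed = interpolate_params(local_params, global_params, alpha=0.5)
```

In practice the same blending would be applied to each tensor in a model's parameter set; the dictionary here merely stands in for that structure.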

@article{wan2025_2503.00407,
  title={Asynchronous Personalized Federated Learning through Global Memorization},
  author={Fan Wan and Yuchen Li and Xueqi Qiu and Rui Sun and Leyuan Zhang and Xingyu Miao and Tianyu Zhang and Haoran Duan and Yang Long},
  journal={arXiv preprint arXiv:2503.00407},
  year={2025}
}