Federated Learning with Unlabeled Clients: Personalization Can Happen in Low Dimensions

Personalized federated learning has emerged as a popular approach to training on devices holding statistically heterogeneous data, known as clients. However, most existing approaches require a client to have labeled data for training or finetuning in order to obtain its own personalized model. In this paper, we address this limitation by proposing FLowDUP, a novel method that can generate a personalized model using only a forward pass with unlabeled data. The generated model parameters reside in a low-dimensional subspace, enabling efficient communication and computation. FLowDUP's learning objective is theoretically motivated by our new transductive multi-task PAC-Bayesian generalization bound, which provides performance guarantees for unlabeled clients. The objective is structured so that both clients with labeled data and clients with only unlabeled data can contribute to the training process. To supplement our theoretical results, we carry out a thorough experimental evaluation of FLowDUP, demonstrating strong empirical performance on a range of datasets with different types of statistically heterogeneous clients. Through numerous ablation studies, we test the efficacy of the individual components of the method.
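To make the high-level mechanism concrete, below is a minimal sketch (not the authors' actual architecture, whose details are not given in the abstract) of how a forward pass over unlabeled client data could yield personalized parameters confined to a low-dimensional subspace: a shared encoder maps the client's unlabeled batch to subspace coefficients, and the personalized weights are an offset of a shared base parameter vector along learned basis directions. The class and variable names (`SubspacePersonalizer`, `subspace_dim`, etc.) are hypothetical.

```python
import torch
import torch.nn as nn

# Hypothetical sketch, not FLowDUP's exact design: a shared generator maps a
# client's unlabeled data to coefficients in a low-dimensional subspace; the
# personalized weights are base parameters plus a learned-basis offset.
class SubspacePersonalizer(nn.Module):
    def __init__(self, feature_dim: int, subspace_dim: int, num_params: int):
        super().__init__()
        # Encoder that embeds unlabeled inputs into the subspace dimension.
        self.encoder = nn.Sequential(
            nn.Linear(feature_dim, 128), nn.ReLU(), nn.Linear(128, subspace_dim)
        )
        # Shared basis spanning the low-dimensional parameter subspace.
        self.basis = nn.Parameter(torch.randn(num_params, subspace_dim) * 0.01)
        # Shared base parameter vector common to all clients.
        self.base_params = nn.Parameter(torch.zeros(num_params))

    def forward(self, unlabeled_x: torch.Tensor) -> torch.Tensor:
        # One forward pass over the client's unlabeled data: average the
        # per-example embeddings to get subspace coefficients, then map them
        # to a full personalized parameter vector. No labels are required.
        coeffs = self.encoder(unlabeled_x).mean(dim=0)          # (subspace_dim,)
        return self.base_params + self.basis @ coeffs           # (num_params,)


# Usage: a client holding only unlabeled features obtains its own parameters.
gen = SubspacePersonalizer(feature_dim=32, subspace_dim=8, num_params=1000)
client_x = torch.randn(64, 32)      # a client's unlabeled data
theta_client = gen(client_x)        # personalized parameters from a forward pass
```

Because only the `subspace_dim`-dimensional coefficients (or the small generator) need to be exchanged rather than the full parameter vector, such a construction would also account for the communication and computation savings the abstract mentions.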
@article{zakerinia2025_2505.15579,
  title   = {Federated Learning with Unlabeled Clients: Personalization Can Happen in Low Dimensions},
  author  = {Hossein Zakerinia and Jonathan Scott and Christoph H. Lampert},
  journal = {arXiv preprint arXiv:2505.15579},
  year    = {2025}
}