Personalized Bayesian Federated Learning with Wasserstein Barycenter Aggregation

Personalized Bayesian federated learning (PBFL) handles non-i.i.d. client data and quantifies uncertainty by combining personalization with Bayesian inference. However, existing PBFL methods face two limitations: restrictive parametric assumptions in client posterior inference and naive parameter averaging for server aggregation. To overcome these issues, we propose FedWBA, a novel PBFL method that enhances both local inference and global aggregation. At the client level, we use particle-based variational inference for nonparametric posterior representation. At the server level, we introduce particle-based Wasserstein barycenter aggregation, which offers a geometrically more meaningful way to combine client posteriors than parameter averaging. Theoretically, we provide local and global convergence guarantees for FedWBA. Locally, we prove a per-iteration lower bound on the decrease of the KL divergence, establishing convergence of the variational inference. Globally, we show that the Wasserstein barycenter converges to the true parameter as the client data size increases. Empirically, experiments show that FedWBA outperforms baselines in prediction accuracy, uncertainty calibration, and convergence rate, with ablation studies confirming its robustness.
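The two stages can be illustrated with a minimal 1-D sketch, which is an assumption-laden toy and not the paper's implementation: each client runs a particle-based variational method (here Stein variational gradient descent, one common choice) against an illustrative Gaussian local posterior, and the server aggregates by averaging sorted particles, which is the exact 2-Wasserstein barycenter for equal-weight 1-D empirical measures with equal particle counts. All function names, the Gaussian targets, and the hyperparameters are hypothetical.

```python
import numpy as np

def svgd_step(x, grad_logp, eps=0.1):
    """One Stein variational gradient descent update on 1-D particles x.

    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) grad log p(x_j) + d/dx_j k(x_j, x_i) ]
    with an RBF kernel and a median-based bandwidth heuristic.
    """
    n = x.shape[0]
    diff = x[:, None] - x[None, :]             # diff[j, i] = x_j - x_i
    d2 = diff ** 2
    h = np.median(d2) / np.log(n + 1) + 1e-8   # median heuristic bandwidth
    k = np.exp(-d2 / (2 * h))                  # k[j, i] = k(x_j, x_i)
    grad_k = -diff / h * k                     # d k(x_j, x_i) / d x_j
    phi = (k * grad_logp(x)[:, None]).sum(axis=0) / n + grad_k.sum(axis=0) / n
    return x + eps * phi

def wasserstein_barycenter_1d(particle_sets):
    """W2 barycenter of equal-weight 1-D empirical measures with equal
    particle counts: average the order statistics (quantile averaging)."""
    return np.mean([np.sort(p) for p in particle_sets], axis=0)

# Illustrative federated round: each client has a Gaussian "local posterior"
# N(mu_k, 1) (an assumption purely for this demo), runs local SVGD,
# and the server aggregates the resulting particle sets.
mus = [-1.0, 0.0, 2.0]
clients = []
for mu in mus:
    x = np.linspace(-3.0, 3.0, 30)             # deterministic particle init
    for _ in range(500):
        x = svgd_step(x, lambda z, mu=mu: -(z - mu))  # grad log N(mu, 1)
    clients.append(x)

global_particles = wasserstein_barycenter_1d(clients)
```

In this 1-D Gaussian setting the barycenter of the client particle sets concentrates around the average of the client means, while naive coordinate-wise averaging of unsorted particles would not respect the geometry of the underlying distributions.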
@article{wei2025_2505.14161,
  title   = {Personalized Bayesian Federated Learning with Wasserstein Barycenter Aggregation},
  author  = {Ting Wei and Biao Mei and Junliang Lyu and Renquan Zhang and Feng Zhou and Yifan Sun},
  journal = {arXiv preprint arXiv:2505.14161},
  year    = {2025}
}