Inclusive, Differentially Private Federated Learning for Clinical Data

Federated Learning (FL) offers a promising approach for training clinical AI models without centralizing sensitive patient data. However, its real-world adoption is hindered by challenges around privacy, resource constraints, and compliance. Existing Differential Privacy (DP) approaches often apply uniform noise across all clients, which disproportionately degrades model performance and penalizes even highly compliant institutions. In this work, we propose a novel compliance-aware FL framework that enhances DP by adaptively adjusting noise according to quantifiable client compliance scores. Additionally, we introduce a compliance scoring tool based on key healthcare and security standards to promote secure, inclusive, and equitable participation across diverse clinical settings. Extensive experiments on public datasets demonstrate that integrating under-resourced, less compliant clinics with highly regulated institutions yields accuracy improvements of up to 15% over traditional FL. By balancing privacy, compliance, and performance, this work makes FL a viable solution for real-world clinical workflows in global healthcare.
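To make the adaptive-noise idea concrete, below is a minimal Python sketch of compliance-aware noise calibration under the Gaussian mechanism. It assumes compliance scores normalized to [0, 1] and a linear mapping from score to noise multiplier; the function names, the sigma_min/sigma_max bounds, and the linear interpolation are illustrative assumptions, not the paper's exact design.

import numpy as np

def calibrated_noise_multiplier(compliance_score, sigma_min=0.5, sigma_max=2.0):
    # Map a client's compliance score in [0, 1] to a DP noise multiplier.
    # Higher compliance -> less noise (closer to sigma_min); lower
    # compliance -> more noise to offset weaker safeguards. The linear
    # interpolation is an illustrative assumption.
    s = min(max(float(compliance_score), 0.0), 1.0)
    return sigma_max - s * (sigma_max - sigma_min)

def privatize_update(update, clip_norm=1.0, compliance_score=0.5, rng=None):
    # Clip a client's model update to bound its L2 sensitivity, then add
    # Gaussian noise scaled by the compliance-aware multiplier (the
    # standard Gaussian mechanism).
    rng = np.random.default_rng() if rng is None else rng
    update = np.asarray(update, dtype=float)
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    sigma = calibrated_noise_multiplier(compliance_score) * clip_norm
    return clipped + rng.normal(0.0, sigma, size=clipped.shape)

In a federated round, each client would apply privatize_update to its model update before transmission, with compliance_score supplied by a scoring tool such as the one the paper derives from healthcare and security standards; the server would then aggregate the noised updates as in standard FL.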
@article{parampottupadam2025_2505.22108,
  title   = {Inclusive, Differentially Private Federated Learning for Clinical Data},
  author  = {Santhosh Parampottupadam and Melih Coşğun and Sarthak Pati and Maximilian Zenk and Saikat Roy and Dimitrios Bounias and Benjamin Hamm and Sinem Sav and Ralf Floca and Klaus Maier-Hein},
  journal = {arXiv preprint arXiv:2505.22108},
  year    = {2025}
}