Addressing the Collaboration Dilemma in Low-Data Federated Learning via Transient Sparsity

Comments: 10 pages main text, 6 pages appendix, 5 pages bibliography; 11 figures, 7 tables
Abstract

Federated learning (FL) enables collaborative model training across decentralized clients while preserving data privacy, leveraging aggregated updates to build robust global models. However, this training paradigm faces significant challenges due to data heterogeneity and limited local datasets, which often impede effective collaboration. In such scenarios, we identify the Layer-wise Inertia Phenomenon in FL, wherein the middle layers of the global model undergo minimal updates after the early communication rounds, ultimately limiting the effectiveness of global aggregation. We demonstrate the presence of this phenomenon across a wide range of federated settings, spanning diverse datasets and architectures. To address this issue, we propose LIPS (Layer-wise Inertia Phenomenon with Sparsity), a simple yet effective method that periodically introduces transient sparsity to stimulate meaningful updates and empower global aggregation. Experiments demonstrate that LIPS effectively mitigates layer-wise inertia, enhances aggregation effectiveness, and improves overall performance across various FL scenarios. This work not only deepens the understanding of layer-wise learning dynamics in FL but also paves the way for more effective collaboration strategies in resource-constrained environments. Our code is publicly available at: this https URL.
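The abstract's core mechanism, periodically zeroing a fraction of weights and then resuming dense training so that local updates become meaningful again, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the magnitude-based selection, the `period=5` schedule, and the `sparsity=0.3` ratio are all assumptions made here for concreteness.

```python
import numpy as np

def apply_transient_sparsity(weights, sparsity):
    """Zero out the smallest-magnitude fraction of a weight tensor.

    Hypothetical sketch: magnitude-based selection is an assumption,
    not necessarily the criterion used by LIPS.
    """
    w = weights.copy()
    k = int(sparsity * w.size)
    if k > 0:
        # Indices of the k smallest-magnitude entries, then zero them.
        idx = np.argsort(np.abs(w).ravel())[:k]
        np.put(w, idx, 0.0)
    return w

def maybe_sparsify(global_weights, round_idx, period=5, sparsity=0.3):
    """Every `period` communication rounds, transiently sparsify the
    weights (e.g., of the middle layers) before broadcasting to clients,
    forcing local training to produce fresh updates. The sparsity is
    transient because subsequent dense training can regrow zeroed
    weights. `period` and `sparsity` are illustrative values."""
    if round_idx % period == 0:
        return apply_transient_sparsity(global_weights, sparsity)
    return global_weights
```

On non-sparsifying rounds `maybe_sparsify` returns the weights unchanged, so the intervention is only a periodic perturbation of the otherwise standard FL aggregation loop.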

@article{xiao2025_2506.00932,
  title={Addressing the Collaboration Dilemma in Low-Data Federated Learning via Transient Sparsity},
  author={Qiao Xiao and Boqian Wu and Andrey Poddubnyy and Elena Mocanu and Phuong H. Nguyen and Mykola Pechenizkiy and Decebal Constantin Mocanu},
  journal={arXiv preprint arXiv:2506.00932},
  year={2025}
}