
Sparse Decentralized Federated Learning

Abstract

Decentralized Federated Learning (DFL) enables collaborative model training without a central server, but communication and computational limitations among distributed nodes pose challenges to its efficiency, stability, and trustworthiness. To address these issues, we introduce a sparsity constraint on the shared model, leading to Sparse DFL (SDFL), and propose a novel algorithm, CEPS. The sparsity constraint enables the use of one-bit compressive sensing, so that only one-bit information is transmitted between a selected subset of neighbour nodes at certain steps, significantly improving communication efficiency. Moreover, we integrate differential privacy into the algorithm to preserve privacy and bolster the trustworthiness of the learning process. Furthermore, CEPS is underpinned by theoretical guarantees on both convergence and privacy. Numerical experiments validate the effectiveness of the proposed algorithm in improving communication and computation efficiency while maintaining a high level of trustworthiness.
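As a rough illustration of the communication pipeline sketched in the abstract (not the authors' CEPS algorithm, whose precise steps are given in the paper), the following Python sketch shows how a node might sparsify its local model, take one-bit sign measurements of random projections in the spirit of one-bit compressive sensing, and inject Gaussian noise before the sign as a simple privacy mechanism. All function names, the measurement matrix Phi, and the parameter values (d, m, k, sigma) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def sparsify(w, k):
    # Hard thresholding: keep the k largest-magnitude entries, zero the rest.
    out = np.zeros_like(w)
    idx = np.argsort(np.abs(w))[-k:]
    out[idx] = w[idx]
    return out

def dp_one_bit_measurements(w_sparse, Phi, sigma):
    # One-bit compressive sensing: transmit only the signs of random projections.
    # Gaussian noise added BEFORE the sign acts as a randomized-response-style
    # perturbation, so each released measurement is still a single bit.
    # (Illustrative mechanism; the paper's DP guarantee is specific to CEPS.)
    noisy = Phi @ w_sparse + rng.normal(0.0, sigma, size=Phi.shape[0])
    return np.sign(noisy)

# Toy usage: d-dimensional local model, m one-bit measurements, sparsity level k.
d, m, k = 1000, 200, 50
w_local = rng.normal(size=d)
Phi = rng.normal(size=(m, d)) / np.sqrt(m)  # random Gaussian measurement matrix

w_k = sparsify(w_local, k)
msg = dp_one_bit_measurements(w_k, Phi, sigma=0.1)
# `msg` holds m values of one bit each, replacing the d-dimensional model in the
# message sent to the selected neighbour nodes.

A receiving neighbour could then recover an approximate sparse model from these signs with a one-bit compressive sensing decoder (e.g., binary iterative hard thresholding); the actual reconstruction and aggregation steps used in the paper are those specified by CEPS.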

@article{sha2025_2308.16671,
  title={Sparse Decentralized Federated Learning},
  author={Shan Sha and Shenglong Zhou and Lingchen Kong and Geoffrey Ye Li},
  journal={arXiv preprint arXiv:2308.16671},
  year={2025}
}