Topology-Aware Knowledge Propagation in Decentralized Learning

Decentralized learning enables collaborative training of models across naturally distributed data without centralized coordination or maintenance of a global model. Instead, devices are organized in arbitrary communication topologies, in which they can only communicate with neighboring devices. Each device maintains its own local model by training on its local data and integrating new knowledge via model aggregation with neighbors. Knowledge therefore propagates across the topology through successive aggregation rounds. We study, in particular, the propagation of out-of-distribution (OOD) knowledge. We find that popular decentralized learning algorithms struggle to propagate OOD knowledge effectively to all devices. Further, we find that both the location of OOD data within a topology and the topology itself significantly impact OOD knowledge propagation. We then propose topology-aware aggregation strategies to accelerate OOD knowledge propagation across devices. Compared to topology-unaware baselines, these strategies improve accuracy on OOD data by 123% on average across models in a topology.
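To make the aggregation mechanism concrete, the following is a minimal sketch (not the paper's method) of one decentralized aggregation round: each device averages its parameters with those of its topology neighbors, so knowledge spreads one hop per round. The function and variable names (`aggregation_round`, `adjacency`, the ring example) are illustrative assumptions; the paper's topology-aware strategies would replace the uniform weighting shown here.

```python
import numpy as np

def aggregation_round(params, adjacency):
    """One decentralized aggregation round: each device averages its own
    model parameters with those of its topology neighbors.

    params:    list of per-device parameter vectors (np.ndarray)
    adjacency: dict mapping device index -> list of neighbor indices
    """
    new_params = []
    for i, theta in enumerate(params):
        neighborhood = [theta] + [params[j] for j in adjacency[i]]
        # Uniform averaging over the closed neighborhood; a topology-aware
        # strategy would instead weight neighbors non-uniformly.
        new_params.append(np.mean(neighborhood, axis=0))
    return new_params

# Example: a 4-device ring topology with 10-dimensional "models".
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
models = [np.random.randn(10) for _ in range(4)]
models = aggregation_round(models, ring)
```

In this sketch, information held only by device 0 needs multiple rounds to reach device 2, which illustrates why the topology and the location of OOD data within it matter for propagation speed.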
@article{sakarvadia2025_2505.11760,
  title   = {Topology-Aware Knowledge Propagation in Decentralized Learning},
  author  = {Mansi Sakarvadia and Nathaniel Hudson and Tian Li and Ian Foster and Kyle Chard},
  journal = {arXiv preprint arXiv:2505.11760},
  year    = {2025}
}