Hubs and Spokes Learning: Efficient and Scalable Collaborative Machine Learning

Atul Sharma
Kavindu Herath
Saurabh Bagchi
Chaoyue Liu
Somali Chaterji
Abstract

We introduce the Hubs and Spokes Learning (HSL) framework, a novel paradigm for collaborative machine learning that combines the strengths of Federated Learning (FL) and peer-to-peer Decentralized Learning (P2PL). HSL employs a two-tier communication structure that avoids the single point of failure inherent in FL and outperforms the state-of-the-art P2PL framework, Epidemic Learning Local (ELL). At equal communication budgets (total edges), HSL achieves higher accuracy than ELL, while at significantly lower communication budgets it can match ELL's performance. For instance, with only 400 edges, HSL reaches the same test accuracy on CIFAR-10 with 100 peers (spokes) that ELL achieves with 1000 edges, demonstrating its suitability for resource-constrained systems. HSL also achieves stronger consensus among nodes after mixing, yielding improved performance with fewer training rounds. We substantiate these claims through rigorous theoretical analyses and extensive experimental results, showcasing HSL's practicality for large-scale collaborative learning.
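The two-tier structure described above can be sketched as a single mixing round: spokes push models to hubs, hubs average and gossip among themselves, and spokes pull the mixed models back. The sketch below is illustrative only; the function name, round-robin spoke-to-hub assignment, and `hub_degree` gossip rule are assumptions for the example, not the paper's exact protocol.

```python
import numpy as np

def hsl_mixing_round(spoke_models, num_hubs, hub_degree, rng):
    """One illustrative hubs-and-spokes mixing round (not the paper's exact algorithm).

    spoke_models: (num_spokes, dim) array of per-spoke model parameters.
    num_hubs:     number of hub nodes (assumed parameter).
    hub_degree:   each hub gossips with this many randomly chosen hubs.
    """
    num_spokes, dim = spoke_models.shape
    # 1) Spokes push models to hubs (round-robin assignment, for the sketch).
    assignment = np.arange(num_spokes) % num_hubs
    hub_models = np.stack([
        spoke_models[assignment == h].mean(axis=0) for h in range(num_hubs)
    ])
    # 2) Hub-to-hub gossip: each hub averages its model with hub_degree
    #    randomly sampled hubs (possibly including itself).
    mixed = np.empty_like(hub_models)
    for h in range(num_hubs):
        peers = rng.choice(num_hubs, size=hub_degree, replace=False)
        mixed[h] = np.vstack([hub_models[h], hub_models[peers]]).mean(axis=0)
    # 3) Spokes pull the mixed model back from their assigned hub.
    return mixed[assignment]
```

With 100 spokes and 10 hubs, each round uses on the order of 100 spoke-hub edges plus `10 * hub_degree` hub-hub edges, far fewer than a dense peer-to-peer gossip, which is the communication saving the abstract highlights.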

@article{sharma2025_2504.20988,
  title={Hubs and Spokes Learning: Efficient and Scalable Collaborative Machine Learning},
  author={Atul Sharma and Kavindu Herath and Saurabh Bagchi and Chaoyue Liu and Somali Chaterji},
  journal={arXiv preprint arXiv:2504.20988},
  year={2025}
}