Multi-Layer Hierarchical Federated Learning with Quantization

13 May 2025
Seyed Mohammad Azimi-Abarghouyi
Carlo Fischione
Abstract

Almost all existing hierarchical federated learning (FL) models are limited to two aggregation layers, which restricts scalability and flexibility in complex, large-scale networks. In this work, we propose a Quantized Multi-Layer Hierarchical Federated Learning framework (QMLHFL), to our knowledge the first to generalize hierarchical FL to arbitrary numbers of layers and network architectures through nested aggregation, while employing a layer-specific quantization scheme to meet communication constraints. We develop a comprehensive convergence analysis for QMLHFL and derive a general convergence condition and rate that reveal the effects of key factors, including the quantization parameters, the hierarchical architecture, and the intra-layer iteration counts. Furthermore, we determine the optimal number of intra-layer iterations that maximizes the convergence rate subject to a deadline constraint accounting for both communication and computation times. Our results show that QMLHFL consistently achieves high learning accuracy, even under high data heterogeneity, and performs notably better with optimized intra-layer iteration counts than with randomly selected ones.
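
The abstract describes nested aggregation over an arbitrary tree, with a layer-specific quantizer applied before each update is forwarded upward. As a rough illustration of that idea only, the Python sketch below builds a small three-layer aggregation tree; the Node class, the stochastic quantizer, the least-squares task, the bit widths, and the tree shape are all illustrative assumptions, not the paper's actual algorithm, notation, or API.

import numpy as np

def stochastic_quantize(x: np.ndarray, num_levels: int) -> np.ndarray:
    """Unbiased stochastic quantizer (a standard scheme; the paper's
    layer-specific quantizer may differ in its details)."""
    norm = np.linalg.norm(x)
    if norm == 0.0:
        return x
    scaled = np.abs(x) / norm * (num_levels - 1)
    lower = np.floor(scaled)
    # Round up with probability equal to the fractional part (unbiased).
    levels = lower + (np.random.rand(*x.shape) < scaled - lower)
    return np.sign(x) * levels * norm / (num_levels - 1)

class Node:
    """A node in the aggregation tree: a leaf client or an aggregator."""
    def __init__(self, children=None, data=None, bits=8, intra_iters=1):
        self.children = children or []   # empty list => leaf client
        self.data = data                 # (X, y) for leaf clients
        self.bits = bits                 # layer-specific quantization budget
        self.intra_iters = intra_iters   # intra-layer iteration count

    def run(self, w, local_steps, lr):
        """Return a quantized model update computed from this subtree."""
        if not self.children:
            # Leaf client: a few SGD steps on a local least-squares loss.
            X, y = self.data
            w_local = w.copy()
            for _ in range(local_steps):
                w_local -= lr * X.T @ (X @ w_local - y) / len(y)
            delta = w_local - w
        else:
            # Aggregator: nested aggregation, repeating intra-layer rounds
            # in which the children's quantized updates are averaged.
            w_layer = w.copy()
            for _ in range(self.intra_iters):
                deltas = [c.run(w_layer, local_steps, lr) for c in self.children]
                w_layer = w_layer + np.mean(deltas, axis=0)
            delta = w_layer - w
        # Quantize with this layer's budget before sending to the parent.
        return stochastic_quantize(delta, num_levels=2 ** self.bits)

rng = np.random.default_rng(0)
d, w_true = 5, np.arange(5, dtype=float)

def make_client():
    X = rng.normal(size=(50, d))
    return Node(data=(X, X @ w_true + 0.1 * rng.normal(size=50)), bits=4)

# Three aggregation layers: root -> 2 edge servers -> 3 clients each.
edges = [Node(children=[make_client() for _ in range(3)], bits=8, intra_iters=2)
         for _ in range(2)]
root = Node(children=edges, bits=16, intra_iters=1)

w = np.zeros(d)
for _ in range(20):                      # global rounds
    w = w + root.run(w, local_steps=5, lr=0.05)
print("error vs. ground truth:", np.linalg.norm(w - w_true))

The per-layer intra_iters values here are arbitrary; in the paper they are the quantities optimized against a deadline constraint covering communication and computation times.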

View on arXiv
@article{azimi-abarghouyi2025_2505.08145,
  title={Multi-Layer Hierarchical Federated Learning with Quantization},
  author={Seyed Mohammad Azimi-Abarghouyi and Carlo Fischione},
  journal={arXiv preprint arXiv:2505.08145},
  year={2025}
}