FLex&Chill: Improving Local Federated Learning Training with Logit Chilling

Main: 33 pages · Bibliography: 6 pages · Appendix: 17 pages · 26 figures · 9 tables
Abstract

Federated learning is inherently hampered by data heterogeneity: training data distributed non-IID across local clients. We propose a novel model training approach for federated learning, FLex&Chill, which exploits the Logit Chilling method. Through extensive evaluations, we demonstrate that, in the presence of the non-IID data characteristics inherent in federated learning systems, this approach can expedite model convergence and improve inference accuracy. Quantitatively, our experiments show up to a 6× improvement in global federated learning model convergence time and up to a 3.37% improvement in inference accuracy.
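The term "Logit Chilling" suggests applying a softmax temperature below 1 during local training, which sharpens the model's output distribution. The paper's exact procedure is not given in this abstract; the following is only a minimal, hypothetical sketch of temperature-scaled softmax, where the function name and default temperature are illustrative assumptions:

```python
import numpy as np

def chilled_softmax(logits, temperature=0.5):
    """Softmax with temperature scaling.

    A temperature below 1 "chills" the logits: dividing by T < 1
    magnifies differences between logits, concentrating probability
    mass on the largest one. (Illustrative sketch, not the paper's
    exact training procedure.)
    """
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = [2.0, 1.0, 0.1]
p_standard = chilled_softmax(logits, temperature=1.0)
p_chilled = chilled_softmax(logits, temperature=0.5)
# The chilled distribution places more mass on the top logit than
# the standard-temperature one does.
```

In a training loop, such a chilled softmax would replace the standard softmax before the cross-entropy loss, steepening gradients for confidently separable examples; how FLex&Chill integrates this into the federated aggregation cycle is detailed in the full paper.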
