Hybrid Batch Normalisation: Resolving the Dilemma of Batch Normalisation in Federated Learning

Abstract

Batch Normalisation (BN) is widely used in conventional deep neural network training to harmonise the input-output distributions for each batch of data. However, federated learning, a distributed learning paradigm, faces the challenge of dealing with non-independent and identically distributed data among the client nodes. Because there is no coherent methodology for updating the BN statistical parameters across clients, standard BN degrades federated learning performance, making an alternative normalisation solution for federated learning a pressing need. In this work, we resolve the dilemma of the BN layer in federated learning by developing a customised normalisation approach, Hybrid Batch Normalisation (HBN). HBN separates the update of statistical parameters (i.e., the means and variances used for evaluation) from that of learnable parameters (i.e., parameters that require gradient updates), obtaining unbiased estimates of the global statistical parameters in distributed scenarios. In contrast with existing solutions, we emphasise the supportive power of global statistics for federated learning. The HBN layer introduces a learnable hybrid distribution factor that allows each computing node to adaptively mix the statistical parameters of the current batch with the global statistics. HBN can serve as a powerful plugin to advance federated learning performance, delivering promising results across a wide range of federated learning settings, especially with small batch sizes and heterogeneous data.
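
The abstract describes the mixing mechanism only at a high level; the following minimal PyTorch-style sketch illustrates the general idea under stated assumptions. The layer name HybridBatchNorm2d, the scalar sigmoid-gated factor alpha, and the buffers global_mean/global_var (assumed to be aggregated by the server and broadcast to clients outside the gradient path) are illustrative choices, not the paper's exact formulation.

import torch
import torch.nn as nn

class HybridBatchNorm2d(nn.Module):
    """Sketch of hybrid normalisation: mix current-batch statistics with
    global (federated) statistics via a learnable factor. Details are
    assumptions, not the authors' exact design."""

    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        # Learnable affine parameters (gradient-updated, as in standard BN).
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))
        # Learnable hybrid distribution factor (a scalar squashed to [0, 1];
        # initialised at 0 so the mix starts balanced at 0.5).
        self.alpha = nn.Parameter(torch.zeros(1))
        # Global statistics: buffers, updated outside the gradient path
        # (e.g. overwritten when the server broadcasts aggregated values).
        self.register_buffer("global_mean", torch.zeros(num_features))
        self.register_buffer("global_var", torch.ones(num_features))
        self.eps = eps

    def forward(self, x):
        if self.training:
            batch_mean = x.mean(dim=(0, 2, 3))
            batch_var = x.var(dim=(0, 2, 3), unbiased=False)
            a = torch.sigmoid(self.alpha)  # mixing weight in [0, 1]
            mean = a * batch_mean + (1 - a) * self.global_mean
            var = a * batch_var + (1 - a) * self.global_var
        else:
            # Evaluation uses the global statistics only.
            mean, var = self.global_mean, self.global_var
        x_hat = (x - mean[None, :, None, None]) / torch.sqrt(
            var[None, :, None, None] + self.eps)
        return (self.weight[None, :, None, None] * x_hat
                + self.bias[None, :, None, None])

How the global buffers are estimated and aggregated across clients is the substance of the paper and is deliberately left out of this sketch.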

@article{chen2025_2505.21877,
  title={Hybrid Batch Normalisation: Resolving the Dilemma of Batch Normalisation in Federated Learning},
  author={Hongyao Chen and Tianyang Xu and Xiaojun Wu and Josef Kittler},
  journal={arXiv preprint arXiv:2505.21877},
  year={2025}
}