Batch normalization does not improve initialization

25 February 2025
Joris Dannemann
Gero Junike
Abstract

Batch normalization is one of the most important regularization techniques for neural networks, significantly improving training by centering the layers of the neural network. There have been several attempts to provide a theoretical justification for batch normalization. Santurkar and Tsipras (2018) [How does batch normalization help optimization? Advances in Neural Information Processing Systems, 31] claim that batch normalization improves initialization. We provide a counterexample showing that this claim is not true, i.e., batch normalization does not improve initialization.
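
For context, batch normalization standardizes each feature over the mini-batch and then applies a learned affine transform. The sketch below shows the standard forward pass (Ioffe & Szegedy, 2015) in NumPy; the function and variable names are illustrative and this is the generic transform, not the paper's counterexample construction:

import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Standard batch normalization over a mini-batch x of shape
    # (batch_size, features); gamma and beta have shape (features,).
    mu = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                    # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # center and rescale each feature
    return gamma * x_hat + beta            # learned scale and shift

# Illustrative usage: after normalization, each feature of the batch
# has approximately zero mean and unit variance.
x = np.random.randn(4, 3)
y = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(y.mean(axis=0), y.std(axis=0))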

View on arXiv
@article{dannemann2025_2502.17913,
  title={Batch normalization does not improve initialization},
  author={Joris Dannemann and Gero Junike},
  journal={arXiv preprint arXiv:2502.17913},
  year={2025}
}