Translation-Equivariance of Normalization Layers and Aliasing in Convolutional Neural Networks

26 May 2025
Jérémy Scanvic
Quentin Barthélemy
Julián Tachella
Abstract

The design of convolutional neural architectures that are exactly equivariant to continuous translations is an active field of research. It promises to benefit scientific computing, notably by making existing imaging systems more physically accurate. Most efforts focus on the design of downsampling/pooling layers, upsampling layers, and activation functions, but little attention has been paid to normalization layers. In this work, we present a novel theoretical framework for understanding the equivariance of normalization layers to discrete shifts and continuous translations. We also determine necessary and sufficient conditions for normalization layers to be equivariant, in terms of the dimensions they operate on. Using real feature maps from ResNet-18 and ImageNet, we test these theoretical results empirically and find that they are consistent with our predictions.
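The role of the normalized dimensions can be illustrated numerically. The sketch below is not taken from the paper; it is a minimal NumPy check, under the assumption of circular (periodic) boundary conditions, that a normalization layer operating over the spatial dimensions commutes with a discrete shift, since the spatial mean and standard deviation are invariant under such a shift.

```python
import numpy as np

def normalize(x, axes):
    # Normalize a feature map to zero mean / unit variance over `axes`.
    mean = x.mean(axis=axes, keepdims=True)
    std = x.std(axis=axes, keepdims=True)
    return (x - mean) / (std + 1e-8)

def shift(x, s):
    # Circular shift along the spatial (last two) axes.
    return np.roll(x, s, axis=(-2, -1))

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16, 16))  # (channels, H, W) feature map

# Equivariance check: normalizing then shifting should match
# shifting then normalizing when the normalization statistics
# are computed over the spatial axes.
a = normalize(shift(x, (3, 5)), axes=(-2, -1))
b = shift(normalize(x, axes=(-2, -1)), (3, 5))
print(np.allclose(a, b))
```

For this choice of axes the check prints `True`; the interesting cases analyzed in the paper concern which choices of normalized dimensions preserve or break this property, particularly under continuous (sub-pixel) translations where aliasing enters.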

@article{scanvic2025_2505.19805,
  title={Translation-Equivariance of Normalization Layers and Aliasing in Convolutional Neural Networks},
  author={Jérémy Scanvic and Quentin Barthélemy and Julián Tachella},
  journal={arXiv preprint arXiv:2505.19805},
  year={2025}
}