Revisiting Residual Connections: Orthogonal Updates for Stable and Efficient Deep Networks

17 May 2025
Giyeong Oh
Woohyun Cho
Siyeol Kim
Suhwan Choi
Younjae Yu
Abstract

Residual connections are pivotal for deep neural networks, enabling greater depth by mitigating vanishing gradients. However, in standard residual updates, the module's output is directly added to the input stream. This can lead to updates that predominantly reinforce or modulate the existing stream direction, potentially underutilizing the module's capacity for learning entirely novel features. In this work, we introduce Orthogonal Residual Update: we decompose the module's output relative to the input stream and add only the component orthogonal to this stream. This design aims to guide modules to contribute primarily new representational directions, fostering richer feature learning while promoting more efficient training. We demonstrate that our orthogonal update strategy improves generalization accuracy and training stability across diverse architectures (ResNetV2, Vision Transformers) and datasets (CIFARs, TinyImageNet, ImageNet-1k), achieving, for instance, a +4.3%p top-1 accuracy gain for ViT-B on ImageNet-1k.
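The update described in the abstract amounts to replacing the standard residual y = x + f(x) with y = x + f(x) − (⟨f(x), x⟩ / ⟨x, x⟩)·x, i.e., discarding the component of the module output that lies along the incoming stream. Below is a minimal PyTorch-style sketch of this idea, assuming the decomposition is taken along the last (feature) dimension; the function name, tensor shapes, and epsilon term are illustrative assumptions, not the authors' exact implementation.

import torch

def orthogonal_residual_update(x: torch.Tensor,
                               module_out: torch.Tensor,
                               eps: float = 1e-6) -> torch.Tensor:
    # Decompose module_out relative to the input stream x (per position),
    # keep only the component orthogonal to x, and add it to the stream.
    dot = (module_out * x).sum(dim=-1, keepdim=True)      # <f(x), x>
    norm_sq = (x * x).sum(dim=-1, keepdim=True)           # <x, x>
    parallel = (dot / (norm_sq + eps)) * x                # component along the stream
    orthogonal = module_out - parallel                    # novel directions only
    return x + orthogonal                                 # orthogonal residual update

# Hypothetical usage with (batch, tokens, features)-shaped activations:
x = torch.randn(8, 16, 256)   # incoming stream
h = torch.randn(8, 16, 256)   # stand-in for a module output f(x)
y = orthogonal_residual_update(x, h)

The eps term only guards against division by a near-zero stream norm; how the paper handles normalization or the projection axis may differ.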

View on arXiv
@article{oh2025_2505.11881,
  title={Revisiting Residual Connections: Orthogonal Updates for Stable and Efficient Deep Networks},
  author={Giyeong Oh and Woohyun Cho and Siyeol Kim and Suhwan Choi and Younjae Yu},
  journal={arXiv preprint arXiv:2505.11881},
  year={2025}
}