The Disharmony between BN and ReLU Causes Gradient Explosion, but is Offset by the Correlation between Activations

23 April 2023
Inyoung Paik
Jaesik Choi
arXiv · PDF · HTML
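
As a rough illustration of the effect named in the title, the sketch below (a toy setup of my own, not the authors' code) stacks Linear -> BatchNorm -> ReLU blocks and checks whether the backpropagated gradient norm at the input grows with depth; growth of this norm is the "gradient explosion" the title refers to, which the paper argues is offset in practice by correlation between activations.

```python
# Minimal sketch, assuming plain MLP blocks and random Gaussian input
# (not the paper's experimental setup): probe how the gradient norm at
# the input scales with depth in a Linear -> BatchNorm -> ReLU stack.
import torch
import torch.nn as nn

torch.manual_seed(0)
width, batch = 256, 512

for depth in (5, 20, 50):
    net = nn.Sequential(*[
        nn.Sequential(nn.Linear(width, width, bias=False),
                      nn.BatchNorm1d(width),
                      nn.ReLU())
        for _ in range(depth)
    ])
    x = torch.randn(batch, width, requires_grad=True)
    net(x).pow(2).mean().backward()
    # If this norm grows as depth increases, gradients are exploding
    # backward even though BN keeps forward activations normalized.
    print(f"depth {depth:3d}: input-grad norm = {x.grad.norm():.3e}")
```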

Papers citing "The Disharmony between BN and ReLU Causes Gradient Explosion, but is Offset by the Correlation between Activations"

2 / 2 papers shown
Opacus: User-Friendly Differential Privacy Library in PyTorch
Ashkan Yousefpour, I. Shilov, Alexandre Sablayrolles, Davide Testuggine, Karthik Prasad, ..., Sayan Ghosh, Akash Bharadwaj, Jessica Zhao, Graham Cormode, Ilya Mironov
VLM · 159 · 349 · 0 · 25 Sep 2021

High-Performance Large-Scale Image Recognition Without Normalization
Andrew Brock, Soham De, Samuel L. Smith, Karen Simonyan
VLM · 223 · 512 · 0 · 11 Feb 2021