ResearchTrend.AI

PowerNorm: Rethinking Batch Normalization in Transformers
arXiv:2003.07845 · 17 March 2020
Sheng Shen, Z. Yao, A. Gholami, Michael W. Mahoney, Kurt Keutzer
Topic: BDL

Papers citing "PowerNorm: Rethinking Batch Normalization in Transformers" (3 / 3 papers shown)
  1. Batch Layer Normalization, A new normalization layer for CNNs and RNN
     A. Ziaee, Erion Çano · 19 Sep 2022 · 19 / 12 / 0
  2. GraphNorm: A Principled Approach to Accelerating Graph Neural Network Training
     Tianle Cai, Shengjie Luo, Keyulu Xu, Di He, Tie-Yan Liu, Liwei Wang · GNN · 07 Sep 2020 · 32 / 158 / 0
  3. IsoBN: Fine-Tuning BERT with Isotropic Batch Normalization
     Wenxuan Zhou, Bill Yuchen Lin, Xiang Ren · 02 May 2020 · 6 / 24 / 0