To Fold or Not to Fold: a Necessary and Sufficient Condition on Batch-Normalization Layers Folding

Edouard Yvinec, Arnaud Dapogny, Kévin Bailly
arXiv:2203.14646, 28 March 2022

Papers citing "To Fold or Not to Fold: a Necessary and Sufficient Condition on Batch-Normalization Layers Folding"

2 / 2 papers shown
SInGE: Sparsity via Integrated Gradients Estimation of Neuron Relevance
Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kévin Bailly
08 Jul 2022

Comparing Rewinding and Fine-tuning in Neural Network Pruning
Alex Renda, Jonathan Frankle, Michael Carbin
05 Mar 2020