ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

On the Compression of Neural Networks Using $\ell_0$-Norm Regularization and Weight Pruning

arXiv: 2109.05075
10 September 2021
F. Oliveira, E. Batista, R. Seara

Papers citing "On the Compression of Neural Networks Using $\ell_0$-Norm Regularization and Weight Pruning"

3 / 3 papers shown

1. CCIL: Continuity-based Data Augmentation for Corrective Imitation Learning
   Liyiming Ke, Yunchu Zhang, Abhay Deshpande, S. Srinivasa, Abhishek Gupta
   OffRL · 27 · 12 · 0 · 19 Oct 2023

2. Pruning and Quantization for Deep Neural Network Acceleration: A Survey
   Tailin Liang, C. Glossner, Lei Wang, Shaobo Shi, Xiaotong Zhang
   MQ · 150 · 675 · 0 · 24 Jan 2021

3. What is the State of Neural Network Pruning?
   Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
   191 · 1,027 · 0 · 06 Mar 2020