arXiv:2011.10520
Rethinking Weight Decay For Efficient Neural Network Pruning
20 November 2020
Hugo Tessier, Vincent Gripon, Mathieu Léonardon, M. Arzel, T. Hannagan, David Bertrand
Papers citing "Rethinking Weight Decay For Efficient Neural Network Pruning" (showing 10 of 60)
| Title | Authors | Tags | Citations | Date |
|---|---|---|---|---|
| Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding | Song Han, Huizi Mao, W. Dally | 3DGS | 8,821 | 01 Oct 2015 |
| Data-free parameter pruning for Deep Neural Networks | Suraj Srinivas, R. Venkatesh Babu | 3DPC | 547 | 22 Jul 2015 |
| Learning both Weights and Connections for Efficient Neural Networks | Song Han, Jeff Pool, J. Tran, W. Dally | CVBM | 6,660 | 08 Jun 2015 |
| Distilling the Knowledge in a Neural Network | Geoffrey E. Hinton, Oriol Vinyals, J. Dean | FedML | 19,609 | 09 Mar 2015 |
| Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift | Sergey Ioffe, Christian Szegedy | OOD | 43,234 | 11 Feb 2015 |
| Adam: A Method for Stochastic Optimization | Diederik P. Kingma, Jimmy Ba | ODL | 149,842 | 22 Dec 2014 |
| Going Deeper with Convolutions | Christian Szegedy, Wei Liu, Yangqing Jia, P. Sermanet, Scott E. Reed, Dragomir Anguelov, D. Erhan, Vincent Vanhoucke, Andrew Rabinovich | | 43,589 | 17 Sep 2014 |
| Very Deep Convolutional Networks for Large-Scale Image Recognition | Karen Simonyan, Andrew Zisserman | FAtt, MDE | 100,213 | 04 Sep 2014 |
| Exploiting Linear Structure Within Convolutional Networks for Efficient Evaluation | Emily L. Denton, Wojciech Zaremba, Joan Bruna, Yann LeCun, Rob Fergus | FAtt | 1,688 | 02 Apr 2014 |
| Network In Network | Min Lin, Qiang Chen, Shuicheng Yan | | 6,274 | 16 Dec 2013 |