Sparsity through evolutionary pruning prevents neuronal networks from overfitting
Richard C. Gerum, A. Erpenbeck, P. Krauss, A. Schilling
arXiv: 1911.10988, 7 November 2019
Links: arXiv (abs) | PDF | HTML
Papers citing "Sparsity through evolutionary pruning prevents neuronal networks from overfitting" (9 papers):
- Always-Sparse Training by Growing Connections with Guided Stochastic Exploration. Mike Heddes, Narayan Srinivasa, T. Givargis, Alexandru Nicolau. 12 Jan 2024.
- pylustrator: Code generation for reproducible figures for publication. Richard C. Gerum. 01 Oct 2019.
- How deep is deep enough? -- Quantifying class separability in the hidden layers of deep neural networks. Junhong Lin, C. Metzner, Andreas K. Maier, Volkan Cevher, Holger Schulze, Patrick Krauss. 05 Nov 2018.
- Scalable Training of Artificial Neural Networks with Adaptive Sparse Connectivity inspired by Network Science. Decebal Constantin Mocanu, Elena Mocanu, Peter Stone, Phuong H. Nguyen, M. Gibescu, A. Liotta. 15 Jul 2017.
- Playing FPS Games with Deep Reinforcement Learning. Guillaume Lample, Devendra Singh Chaplot. 18 Sep 2016.
- Learning Structured Sparsity in Deep Neural Networks. W. Wen, Chunpeng Wu, Yandan Wang, Yiran Chen, Hai Helen Li. 12 Aug 2016.
- Structured Pruning of Deep Convolutional Neural Networks. S. Anwar, Kyuyeon Hwang, Wonyong Sung. 29 Dec 2015.
- Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding. Song Han, Huizi Mao, W. Dally. 01 Oct 2015.
- On the difficulty of training Recurrent Neural Networks. Razvan Pascanu, Tomas Mikolov, Yoshua Bengio. 21 Nov 2012.