MaxGain: Regularisation of Neural Networks by Constraining Activation Magnitudes
arXiv:1804.05965 · 16 April 2018
Henry Gouk, Bernhard Pfahringer, E. Frank, M. Cree
Papers citing "MaxGain: Regularisation of Neural Networks by Constraining Activation Magnitudes" (7 / 7 papers shown)
Title | Authors | Topics | Metrics | Date
Nonlinearity Enhanced Adaptive Activation Functions | David Yevick | | 65 · 1 · 0 | 29 Mar 2024
Regularisation of Neural Networks by Enforcing Lipschitz Continuity | Henry Gouk, E. Frank, Bernhard Pfahringer, M. Cree | | 170 · 478 · 0 | 12 Apr 2018
Concrete Dropout | Y. Gal, Jiri Hron, Alex Kendall | BDL, UQCV | 179 · 592 · 0 | 22 May 2017
Improved Training of Wasserstein GANs | Ishaan Gulrajani, Faruk Ahmed, Martín Arjovsky, Vincent Dumoulin, Aaron Courville | GAN | 207 · 9,548 · 0 | 31 Mar 2017
Understanding deep learning requires rethinking generalization | Chiyuan Zhang, Samy Bengio, Moritz Hardt, Benjamin Recht, Oriol Vinyals | HAI | 339 · 4,629 · 0 | 10 Nov 2016
Wide Residual Networks | Sergey Zagoruyko, N. Komodakis | | 340 · 7,985 · 0 | 23 May 2016
Variational Dropout and the Local Reparameterization Trick | Diederik P. Kingma, Tim Salimans, Max Welling | BDL | 226 · 1,514 · 0 | 08 Jun 2015