arXiv:1802.07042
Do deep nets really need weight decay and dropout?
20 February 2018
Alex Hernández-García, Peter König
Papers citing "Do deep nets really need weight decay and dropout?" (4 papers)
A Data-Augmentation Is Worth A Thousand Samples: Exact Quantification From Analytical Augmented Sample Moments
Randall Balestriero, Ishan Misra, Yann LeCun
16 Feb 2022

Further advantages of data augmentation on convolutional neural networks
Alex Hernández-García, Peter König
26 Jun 2019

Data augmentation instead of explicit regularization
Alex Hernández-García, Peter König
11 Jun 2018

A disciplined approach to neural network hyper-parameters: Part 1 -- learning rate, batch size, momentum, and weight decay
L. Smith
26 Mar 2018