Do deep nets really need weight decay and dropout?

20 February 2018
Alex Hernández-García
Peter König
arXiv:1802.07042

Papers citing "Do deep nets really need weight decay and dropout?"

4 papers shown

A Data-Augmentation Is Worth A Thousand Samples: Exact Quantification From Analytical Augmented Sample Moments
Randall Balestriero, Ishan Misra, Yann LeCun
16 Feb 2022

Further advantages of data augmentation on convolutional neural networks
Alex Hernández-García, Peter König
26 Jun 2019

Data augmentation instead of explicit regularization
Alex Hernández-García, Peter König
11 Jun 2018

A disciplined approach to neural network hyper-parameters: Part 1 -- learning rate, batch size, momentum, and weight decay
L. Smith
26 Mar 2018