ResearchTrend.AI

Whitening and second order optimization both make information in the dataset unusable during training, and can reduce or prevent generalization
arXiv: 2008.07545

17 August 2020
Neha S. Wadia
Daniel Duckworth
S. Schoenholz
Ethan Dyer
Jascha Narain Sohl-Dickstein

Papers citing "Whitening and second order optimization both make information in the dataset unusable during training, and can reduce or prevent generalization"

4 papers
Effective Learning with Node Perturbation in Multi-Layer Neural Networks
Sander Dalm, Marcel van Gerven, Nasir Ahmad
AAML · 02 Oct 2023
General-Purpose In-Context Learning by Meta-Learning Transformers
Louis Kirsch, James Harrison, Jascha Narain Sohl-Dickstein, Luke Metz
08 Dec 2022
Amortized Proximal Optimization
Juhan Bae, Paul Vicol, Jeff Z. HaoChen, Roger C. Grosse
ODL · 28 Feb 2022
Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington
14 Jun 2018