ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Rapid training of deep neural networks without skip connections or normalization layers using Deep Kernel Shaping

5 October 2021 · arXiv:2110.01765
James Martens, Andy Ballard, Guillaume Desjardins, G. Swirszcz, Valentin Dalibard, Jascha Narain Sohl-Dickstein, S. Schoenholz

Papers citing "Rapid training of deep neural networks without skip connections or normalization layers using Deep Kernel Shaping"

7 / 7 papers shown

Frequency and Generalisation of Periodic Activation Functions in Reinforcement Learning
Augustine N. Mavor-Parker, Matthew J. Sargent, Caswell Barry, Lewis D. Griffin, Clare Lyle
09 Jul 2024

Understanding and Minimising Outlier Features in Neural Network Training
Bobby He, Lorenzo Noci, Daniele Paliotta, Imanol Schlag, Thomas Hofmann
29 May 2024

Width and Depth Limits Commute in Residual Networks
Soufiane Hayou, Greg Yang
01 Feb 2023

A Closer Look at Learned Optimization: Stability, Robustness, and Inductive Biases
James Harrison, Luke Metz, Jascha Narain Sohl-Dickstein
22 Sep 2022

Gaussian Pre-Activations in Neural Networks: Myth or Reality?
Pierre Wolinski, Julyan Arbel
24 May 2022 · AI4CE

Deep Learning without Shortcuts: Shaping the Kernel with Tailored Rectifiers
Guodong Zhang, Aleksandar Botev, James Martens
15 Mar 2022 · OffRL

On the Optimization Landscape of Neural Collapse under MSE Loss: Global Optimality with Unconstrained Features
Jinxin Zhou, Xiao Li, Tian Ding, Chong You, Qing Qu, Zhihui Zhu
02 Mar 2022