ReLU soothes the NTK condition number and accelerates optimization for wide neural networks

15 May 2023
Chaoyue Liu
Like Hui

Papers citing "ReLU soothes the NTK condition number and accelerates optimization for wide neural networks"

2 papers shown

Theoretical characterisation of the Gauss-Newton conditioning in Neural Networks
Jim Zhao, Sidak Pal Singh, Aurelien Lucchi
04 Nov 2024

Fast Training of Sinusoidal Neural Fields via Scaling Initialization
Taesun Yeom, Sangyoon Lee, Jaeho Lee
07 Oct 2024