
An Analytical Formula of Population Gradient for two-layered ReLU network and its Applications in Convergence and Critical Point Analysis

2 March 2017 · Yuandong Tian · MLT

Papers citing "An Analytical Formula of Population Gradient for two-layered ReLU network and its Applications in Convergence and Critical Point Analysis"

4 / 4 papers shown
Nonparametric Learning of Two-Layer ReLU Residual Units
Zhunxuan Wang, Linyun He, Chunchuan Lyu, Shay B. Cohen
MLT, OffRL · 17 Aug 2020
Universal Statistics of Fisher Information in Deep Neural Networks: Mean Field Approach
Ryo Karakida, S. Akaho, S. Amari
FedML · 04 Jun 2018
Learning ReLUs via Gradient Descent
Mahdi Soltanolkotabi
MLT · 10 May 2017
Exact solutions to the nonlinear dynamics of learning in deep linear neural networks
Andrew M. Saxe, James L. McClelland, Surya Ganguli
ODL · 20 Dec 2013