ResearchTrend.AI

arXiv: 2007.07213
Plateau Phenomenon in Gradient Descent Training of ReLU networks: Explanation, Quantification and Avoidance

14 July 2020
M. Ainsworth, Yeonjong Shin
ODL
arXiv (abs) · PDF · HTML

Papers citing "Plateau Phenomenon in Gradient Descent Training of ReLU networks: Explanation, Quantification and Avoidance"

2 / 2 papers shown
On the Omnipresence of Spurious Local Minima in Certain Neural Network Training Problems
C. Christof, Julia Kowalczyk
23 Feb 2022
Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions
Ameya Dilip Jagtap, Yeonjong Shin, Kenji Kawaguchi, George Karniadakis
ODL
20 May 2021